Apr 22 17:53:38.405165 ip-10-0-143-11 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:38.838576 ip-10-0-143-11 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:38.838576 ip-10-0-143-11 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:38.838576 ip-10-0-143-11 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:38.838576 ip-10-0-143-11 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:38.838576 ip-10-0-143-11 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
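The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file named by --config. A minimal sketch of what that might look like, assuming the upstream kubelet.config.k8s.io/v1beta1 schema — the values shown here are illustrative placeholders, not the values this node actually uses:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (illustrative reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

On this host the config file is /etc/kubernetes/kubelet.conf (see the --config flag dump later in the log), and it is typically managed by the Machine Config Operator rather than edited by hand.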
Apr 22 17:53:38.840308 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.840146 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:38.844757 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844736 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.844757 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844753 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.844757 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844756 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.844757 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844760 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.844757 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844763 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844767 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844770 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844772 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844775 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844778 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844780 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844783 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844786 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844790 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844792 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844795 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844798 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844801 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844804 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844807 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844809 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844812 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844814 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.844971 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844817 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844820 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844823 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844827 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844830 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844833 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844843 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844846 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844849 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844852 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844855 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844870 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844873 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844875 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844878 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844882 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844885 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844888 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844890 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.845428 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844893 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844897 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844902 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844906 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844909 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844911 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844914 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844917 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844920 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844923 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844925 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844928 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844931 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844934 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844936 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844939 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844941 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844944 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844947 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844950 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.845917 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844953 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844955 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844958 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844960 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844963 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844966 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844968 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844971 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844975 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844978 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844981 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844984 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844987 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844989 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844993 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844995 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.844998 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845001 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845004 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845006 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.846420 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845009 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845014 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845016 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845019 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845434 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845439 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845442 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845445 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845447 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845450 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845453 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845455 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845458 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845461 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845463 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845466 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845469 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845471 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845474 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.846926 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845476 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845481 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845484 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845487 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845490 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845492 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845495 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845498 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845501 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845504 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845506 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845509 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845511 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845514 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845516 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845519 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845521 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845524 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845527 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845529 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.847385 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845531 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845534 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845537 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845539 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845542 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845545 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845548 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845551 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845553 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845556 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845559 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845562 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845570 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845573 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845576 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845578 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845581 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845583 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845586 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845588 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.847908 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845591 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845594 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845597 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845601 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845604 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845607 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845609 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845612 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845615 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845617 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845620 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845622 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845625 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845627 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845630 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845632 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845635 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845640 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845643 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.848405 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845647 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845650 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845653 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845656 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845659 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845662 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845664 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845667 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845670 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845673 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845676 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.845679 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.846997 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847011 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847020 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847025 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847030 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847033 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847037 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847042 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847045 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:38.848889 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847048 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847052 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847055 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847058 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847061 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847064 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847067 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847070 2564 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847073 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847076 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847081 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847084 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847087 2564 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847090 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847093 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847098 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847101 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847104 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847107 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847111 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847114 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847116 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847120 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847123 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847128 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:38.849401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847131 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847135 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847137 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847141 2564 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847143 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847148 2564 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847151 2564 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847154 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847156 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847159 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847163 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847166 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847169 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847172 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847175 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847177 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847181 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847184 2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847187 2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847190 2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847193 2564 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847197 2564 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847200 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847203 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847206 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847209 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:53:38.850044 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847212 2564 flags.go:64] FLAG: --help="false" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847215 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-143-11.ec2.internal" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847218 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847221 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847224 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847228 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847232 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847235 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847238 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:38.847241 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847244 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847247 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847250 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847253 2564 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847256 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847259 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847262 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847265 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847268 2564 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847270 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847273 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847276 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847282 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:38.850671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847285 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:38.851284 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847288 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847291 2564 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847294 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847297 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847300 2564 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847303 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847307 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847310 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847314 2564 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847317 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847320 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847323 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847326 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847329 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847333 2564 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847336 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847344 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847349 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847352 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847355 2564 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847358 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847364 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847367 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:38.851284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847370 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847373 2564 flags.go:64] FLAG: --port="10250" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847376 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847379 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-001eba6949567b988" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847383 2564 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847386 
2564 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847389 2564 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847392 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847396 2564 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847400 2564 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847403 2564 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847406 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847409 2564 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847413 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847416 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847419 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847421 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847424 2564 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847427 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847430 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847433 2564 
flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847436 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847439 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847442 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847445 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847448 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:38.851930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847451 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847455 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847458 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847461 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847464 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847467 2564 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847470 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847475 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847478 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 22 
17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847481 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847485 2564 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847488 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847491 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847494 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847501 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847504 2564 flags.go:64] FLAG: --v="2" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847508 2564 flags.go:64] FLAG: --version="false" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847512 2564 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847517 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.847520 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847619 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847622 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847626 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: W0422 
17:53:38.847629 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:38.852602 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847632 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847635 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847637 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847641 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847645 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847648 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847651 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847654 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847657 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847659 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847663 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847665 2564 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847668 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847671 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847673 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847676 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847678 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847681 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847684 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847687 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:38.853215 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847689 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847692 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847696 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847698 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847701 2564 feature_gate.go:328] unrecognized 
feature gate: MachineAPIMigration Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847704 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847706 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847709 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847712 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847715 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847719 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847722 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847725 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847728 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847731 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847735 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847739 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847742 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847744 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:38.853715 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847747 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847750 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847753 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847755 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847759 2564 feature_gate.go:328] 
unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847762 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847766 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847770 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847774 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847777 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847781 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847785 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847789 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847795 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847801 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847808 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847814 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847818 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847822 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:38.854255 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847827 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847831 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847836 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847840 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847845 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847849 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847853 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:38.854726 ip-10-0-143-11 
kubenswrapper[2564]: W0422 17:53:38.847871 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847876 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847881 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847887 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847892 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847896 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847900 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847905 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847909 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847914 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847918 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847923 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847927 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.854726 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847932 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.855245 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847936 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.855245 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847941 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.855245 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.847945 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.855245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.848683 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.855486 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.855467 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:53:38.855522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.855489 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:53:38.855550 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855545 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.855550 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855550 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855553 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855558 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855563 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855570 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855575 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855578 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855581 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855584 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855587 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855590 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855593 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855596 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855598 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855601 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855604 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855607 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.855604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855610 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855613 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855616 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855619 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855622 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855625 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855628 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855631 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855634 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855638 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855643 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855647 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855650 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855653 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855656 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855658 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855660 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855663 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855666 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855668 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.856040 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855671 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855673 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855676 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855678 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855681 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855684 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855686 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855689 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855691 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855694 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855696 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855699 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855701 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855703 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855707 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855710 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855713 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855718 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855726 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855729 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.856552 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855732 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855736 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855739 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855742 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855745 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855747 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855750 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855752 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855755 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855757 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855760 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855763 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855765 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855768 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855770 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855774 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855777 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855779 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855782 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855784 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.857083 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855787 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855791 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855795 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855799 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855803 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855805 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855809 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855812 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.855817 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855938 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855946 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855952 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855956 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855960 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855962 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.857565 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855965 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855968 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855970 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855973 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855976 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855978 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855981 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855983 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855986 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855988 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855991 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855993 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855996 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.855998 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856001 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856003 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856006 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856008 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856011 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856014 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.857958 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856016 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856020 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856025 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856029 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856035 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856038 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856041 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856043 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856046 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856049 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856052 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856055 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856058 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856060 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856063 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856065 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856068 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856070 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856073 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856076 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.858455 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856078 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856081 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856083 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856086 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856088 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856090 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856093 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856097 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856101 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856105 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856109 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856112 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856115 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856117 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856120 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856123 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856125 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856128 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856130 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856133 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.858976 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856135 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856138 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856142 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856144 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856148 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856152 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856155 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856157 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856160 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856163 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856166 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856168 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856171 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856174 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856179 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856185 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856190 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856192 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856195 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.859461 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:38.856197 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.859929 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.856202 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.859929 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.856876 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:53:38.860539 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.860526 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:53:38.861418 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.861407 2564 server.go:1019] "Starting client certificate rotation"
Apr 22 17:53:38.861517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.861501 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:38.862312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.862301 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:38.887386 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.887365 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:38.890034 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.890011 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:38.903582 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.903559 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:53:38.909428 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.909410 2564 log.go:25] "Validated CRI v1 image API"
Apr 22 17:53:38.912297 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.912277 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:53:38.915687 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.915666 2564 fs.go:135] Filesystem UUIDs: map[067e4c53-efee-49fa-bb9f-a7cbb7644310:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c4121459-461e-43d2-b8db-e48021d7528a:/dev/nvme0n1p3]
Apr 22 17:53:38.915774 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.915685 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:53:38.921489 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.921376 2564 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:38.919395207 +0000 UTC m=+0.397918620 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3059446 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bcaff9c9b7a6bd60e70ff35a7caa2 SystemUUID:ec2bcaff-9c9b-7a6b-d60e-70ff35a7caa2 BootID:5a378193-2c49-4b30-8212-df9c0d2ee666 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:dd:53:9d:11:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:dd:53:9d:11:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:f7:f2:53:f3:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:53:38.921489 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.921480 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:53:38.921621 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.921564 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:38.922004 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.921985 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:38.925700 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925679 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:38.925839 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925701 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-11.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:38.925901 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925847 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:38.925901 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925856 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:38.925901 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925889 2564 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:38.926004 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.925904 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:38.926708 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.926698 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:38.926813 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.926805 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 17:53:38.929798 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.929787 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 22 17:53:38.929837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.929802 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 17:53:38.929837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.929817 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 17:53:38.929837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.929828 2564 kubelet.go:397] "Adding apiserver pod source" Apr 22 17:53:38.929837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.929838 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 17:53:38.931068 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.931056 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:38.931109 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.931074 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:38.934278 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.934262 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:53:38.935513 ip-10-0-143-11 
kubenswrapper[2564]: I0422 17:53:38.935499 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:53:38.937084 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937071 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937088 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937094 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937100 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937106 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937132 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:53:38.937154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937141 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:53:38.937312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937169 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:53:38.937312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937180 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:53:38.937312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937190 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:53:38.937312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937210 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:53:38.937312 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.937222 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:53:38.938167 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.938152 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:53:38.938199 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.938174 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:53:38.941726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.941706 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-11.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:53:38.941798 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.941750 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:53:38.941798 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.941770 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:53:38.942121 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.942109 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:53:38.942152 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.942146 2564 server.go:1295] "Started kubelet" Apr 22 17:53:38.942247 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.942215 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 
17:53:38.942334 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.942285 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:53:38.942373 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.942360 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:53:38.943046 ip-10-0-143-11 systemd[1]: Started Kubernetes Kubelet. Apr 22 17:53:38.943651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.943460 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:53:38.943651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.943604 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:53:38.948711 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.948679 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:38.949069 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.949032 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:53:38.950105 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.950081 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:53:38.951853 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.950113 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found" Apr 22 17:53:38.951853 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.950137 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:53:38.952002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.951884 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:53:38.952002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.951798 2564 factory.go:55] Registering systemd factory Apr 22 17:53:38.952092 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952055 2564 factory.go:223] 
Registration of the systemd container factory successfully Apr 22 17:53:38.952146 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952107 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:53:38.952146 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952118 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:53:38.952146 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.950401 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-11.ec2.internal.18a8bf53a36471dd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-11.ec2.internal,UID:ip-10-0-143-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-11.ec2.internal,},FirstTimestamp:2026-04-22 17:53:38.942120413 +0000 UTC m=+0.420643830,LastTimestamp:2026-04-22 17:53:38.942120413 +0000 UTC m=+0.420643830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-11.ec2.internal,}" Apr 22 17:53:38.952686 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952665 2564 factory.go:153] Registering CRI-O factory Apr 22 17:53:38.952686 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952682 2564 factory.go:223] Registration of the crio container factory successfully Apr 22 17:53:38.952785 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952738 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:53:38.952785 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:53:38.952760 2564 factory.go:103] Registering Raw factory Apr 22 17:53:38.952785 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.952777 2564 manager.go:1196] Started watching for new ooms in manager Apr 22 17:53:38.953204 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.953184 2564 manager.go:319] Starting recovery of all containers Apr 22 17:53:38.961388 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.955923 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:53:38.961388 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.957975 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:53:38.961388 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.958171 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 17:53:38.961388 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.960656 2564 manager.go:324] Recovery completed Apr 22 17:53:38.965117 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.965104 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.967376 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967356 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h9f6s" Apr 22 17:53:38.967453 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:38.967375 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.967453 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967405 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.967453 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967416 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.967847 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967833 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:53:38.967847 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967847 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:53:38.967944 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.967893 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:38.969472 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.969395 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-11.ec2.internal.18a8bf53a4e602be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-11.ec2.internal,UID:ip-10-0-143-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-11.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-11.ec2.internal,},FirstTimestamp:2026-04-22 17:53:38.967388862 +0000 UTC m=+0.445912286,LastTimestamp:2026-04-22 17:53:38.967388862 +0000 UTC m=+0.445912286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-11.ec2.internal,}" Apr 22 17:53:38.970340 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.970302 2564 policy_none.go:49] "None policy: Start" Apr 22 17:53:38.970340 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.970322 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:53:38.970340 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.970336 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:53:38.975815 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:38.975802 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h9f6s" Apr 22 17:53:38.978598 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:38.978537 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-11.ec2.internal.18a8bf53a4e656c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-11.ec2.internal,UID:ip-10-0-143-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-11.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-11.ec2.internal,},FirstTimestamp:2026-04-22 17:53:38.967410373 +0000 UTC m=+0.445933790,LastTimestamp:2026-04-22 17:53:38.967410373 +0000 UTC m=+0.445933790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-11.ec2.internal,}" Apr 22 17:53:39.009568 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.009551 2564 manager.go:341] "Starting Device Plugin manager" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: E0422 
17:53:39.009582 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.009593 2564 server.go:85] "Starting device plugin registration server" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.009889 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.009906 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.010053 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.010126 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.010135 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.010614 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:53:39.021229 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.010647 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-11.ec2.internal\" not found" Apr 22 17:53:39.051450 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.051408 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:53:39.052607 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.052591 2564 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 17:53:39.052710 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.052615 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:53:39.052710 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.052632 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 17:53:39.052710 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.052638 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:53:39.052710 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.052668 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:53:39.055446 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.055430 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:39.110836 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.110773 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:39.111928 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.111912 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:39.112011 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.111942 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:39.112011 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.111958 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:39.112011 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.111984 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.121741 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.121726 2564 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.121806 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.121751 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-11.ec2.internal\": node \"ip-10-0-143-11.ec2.internal\" not found" Apr 22 17:53:39.134796 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.134776 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found" Apr 22 17:53:39.153204 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.153164 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"] Apr 22 17:53:39.153282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.153241 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:39.154721 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.154704 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:39.154795 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.154730 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:39.154795 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.154742 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:39.155995 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.155984 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:39.156153 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156138 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.156190 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156168 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:39.156711 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156691 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:39.156759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156721 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:39.156759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156697 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:39.156759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156753 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:39.156849 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156763 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:39.156849 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.156733 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:39.158046 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.158030 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.158099 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.158060 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:39.158721 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.158704 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:39.158797 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.158734 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:39.158797 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.158748 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:39.184609 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.184586 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-11.ec2.internal\" not found" node="ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.189055 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.189039 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-11.ec2.internal\" not found" node="ip-10-0-143-11.ec2.internal" Apr 22 17:53:39.235710 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.235682 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found" Apr 22 17:53:39.253685 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.253657 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.253685 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.253687 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.253841 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.253707 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdf6ed0045bf81395f30896fa82f74ae-config\") pod \"kube-apiserver-proxy-ip-10-0-143-11.ec2.internal\" (UID: \"fdf6ed0045bf81395f30896fa82f74ae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.336821 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.336790 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found"
Apr 22 17:53:39.354311 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdf6ed0045bf81395f30896fa82f74ae-config\") pod \"kube-apiserver-proxy-ip-10-0-143-11.ec2.internal\" (UID: \"fdf6ed0045bf81395f30896fa82f74ae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.354398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354315 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.354398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354334 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.354474 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354403 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdf6ed0045bf81395f30896fa82f74ae-config\") pod \"kube-apiserver-proxy-ip-10-0-143-11.ec2.internal\" (UID: \"fdf6ed0045bf81395f30896fa82f74ae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.354474 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354410 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.354474 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.354446 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc4d4c20e132e1722348f099c5424474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal\" (UID: \"fc4d4c20e132e1722348f099c5424474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.437763 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.437701 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found"
Apr 22 17:53:39.486338 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.486303 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.491039 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.491019 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.538779 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.538747 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found"
Apr 22 17:53:39.639384 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.639351 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found"
Apr 22 17:53:39.740142 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.740039 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-11.ec2.internal\" not found"
Apr 22 17:53:39.835132 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.835102 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:39.850175 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.850149 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.861598 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.861567 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:53:39.861739 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.861687 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:39.861739 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.861720 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:39.861814 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.861746 2564 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a422dbee3ef56468a9239c0d3b89ff56-a52af8bcb005d4b5.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.143.11:60760->52.201.189.205:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.861814 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.861767 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal"
Apr 22 17:53:39.878662 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.878635 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:39.930314 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.930284 2564 apiserver.go:52] "Watching apiserver"
Apr 22 17:53:39.940311 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.940289 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:53:39.941325 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.941304 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wcgxk","openshift-network-diagnostics/network-check-target-rx62v","openshift-ovn-kubernetes/ovnkube-node-8zxzp","openshift-cluster-node-tuning-operator/tuned-tr9fb","openshift-multus/multus-additional-cni-plugins-rdt5n","openshift-multus/multus-ctdsd","openshift-network-operator/iptables-alerter-dwg7z","kube-system/konnectivity-agent-fgqcv","kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw","openshift-image-registry/node-ca-d9mrk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal"]
Apr 22 17:53:39.942713 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.942694 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:39.942802 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.942755 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:39.943788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.943774 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:39.943837 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:39.943826 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:39.944948 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.944928 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.947835 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.947228 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.947835 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.947636 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.947835 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.947756 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.948057 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.947930 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:53:39.948057 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948048 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:53:39.948147 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948130 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-twsbx\""
Apr 22 17:53:39.948716 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948698 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.948833 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948768 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:53:39.948833 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948780 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:39.948833 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.948794 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:53:39.949589 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.949572 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nqhxk\""
Apr 22 17:53:39.949820 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.949801 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.949912 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.949821 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.950059 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.950045 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.951197 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.951177 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwg7z"
Apr 22 17:53:39.951292 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.951211 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdwr4\""
Apr 22 17:53:39.951292 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.951280 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:53:39.952781 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.952468 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.952781 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.952562 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:53:39.952781 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.952745 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-scn75\""
Apr 22 17:53:39.952781 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.952771 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:53:39.953056 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.953011 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.953140 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.953118 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:53:39.954002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.953951 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.954002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.953984 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fgqcv"
Apr 22 17:53:39.954917 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.954901 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kkmxm\""
Apr 22 17:53:39.955356 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.955342 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:53:39.955413 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.955342 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.955922 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.955905 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw"
Apr 22 17:53:39.957169 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.957151 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:53:39.957268 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.957170 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:53:39.957268 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.957215 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p7w9b\""
Apr 22 17:53:39.957268 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.957249 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d9mrk"
Apr 22 17:53:39.958263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958248 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:53:39.958448 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958433 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.958536 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958522 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.958621 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-host\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.958675 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958629 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-tmp\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.958675 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958639 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-l5nc5\""
Apr 22 17:53:39.958675 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958644 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-conf\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.958675 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958659 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-lib-modules\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958696 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc9de976-617d-407c-9074-f0ad44c2518d-host-slash\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958725 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-etc-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958749 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-bin\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958782 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-etc-kubernetes\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958802 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:39.958825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958818 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-netns\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-cni-binary-copy\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958903 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-agent-certs\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958930 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-node-log\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958952 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-systemd-units\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.958975 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-script-lib\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959014 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-daemon-config\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959052 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.959097 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959085 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-var-lib-kubelet\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959108 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-slash\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959130 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-systemd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-var-lib-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959181 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9v7\" (UniqueName: \"kubernetes.io/projected/75123953-ef56-489a-8b07-e5d0a129fad3-kube-api-access-xn9v7\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959207 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cnibin\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959231 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2c4\" (UniqueName: \"kubernetes.io/projected/b86c7b56-b95e-4b34-8a02-a7cbb80decae-kube-api-access-jm2c4\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959258 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-systemd\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959283 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-tuned\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-cnibin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959331 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959338 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-kubernetes\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdmh\" (UniqueName: \"kubernetes.io/projected/bcc515e3-814e-41a4-9bbe-dc0050efd02c-kube-api-access-xcdmh\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959385 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959410 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959435 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959459 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-kubelet\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.959517 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-conf-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959508 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-ovn\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959543 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-modprobe-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959550 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959577 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959598 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-sys\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959609 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959621 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-env-overrides\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959553 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btfrc\""
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959658 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-netns\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959705 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-bin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959727 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-multus\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959765 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7htf\" (UniqueName: \"kubernetes.io/projected/2c1e2467-3796-47ad-928c-f82f435261e9-kube-api-access-t7htf\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959790 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-kubelet\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959811 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-log-socket\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959839 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-netd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959886 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-konnectivity-ca\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv"
Apr 22 17:53:39.960250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959905 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-config\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp"
Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959920 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-system-cni-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959935 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-os-release\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n"
Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.959949 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-os-release\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd"
Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:53:39.959994 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysconfig\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960022 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dc9de976-617d-407c-9074-f0ad44c2518d-iptables-alerter-script\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960039 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkbd\" (UniqueName: \"kubernetes.io/projected/dc9de976-617d-407c-9074-f0ad44c2518d-kube-api-access-mhkbd\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960054 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960068 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960095 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960110 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75123953-ef56-489a-8b07-e5d0a129fad3-ovn-node-metrics-cert\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960125 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-system-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960169 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-socket-dir-parent\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:53:39.960197 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-k8s-cni-cncf-io\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960220 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-hostroot\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9x7\" (UniqueName: \"kubernetes.io/projected/d41f25be-a0c4-4095-99e5-f6190accf5a8-kube-api-access-4p9x7\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.960788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960264 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-run\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:39.961325 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " 
pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:39.961325 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.960317 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-multus-certs\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:39.967201 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.967183 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:53:39.977792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.977767 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:38 +0000 UTC" deadline="2028-02-05 00:08:07.587159612 +0000 UTC" Apr 22 17:53:39.977792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.977791 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15678h14m27.609372346s" Apr 22 17:53:39.984828 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.984806 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l2mtr" Apr 22 17:53:39.995310 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:39.995292 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l2mtr" Apr 22 17:53:40.004225 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.004197 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4d4c20e132e1722348f099c5424474.slice/crio-1a76f3f253f8f871c1b2c7466b45e7182b3b10419c5ec69511620189aa0130d9 WatchSource:0}: Error finding container 1a76f3f253f8f871c1b2c7466b45e7182b3b10419c5ec69511620189aa0130d9: Status 404 returned error can't find the container with id 1a76f3f253f8f871c1b2c7466b45e7182b3b10419c5ec69511620189aa0130d9 Apr 22 17:53:40.004484 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.004464 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf6ed0045bf81395f30896fa82f74ae.slice/crio-dfb99b870b2c4548878bf5a946708943e8e188006d68849aa503e0007e032c11 WatchSource:0}: Error finding container dfb99b870b2c4548878bf5a946708943e8e188006d68849aa503e0007e032c11: Status 404 returned error can't find the container with id dfb99b870b2c4548878bf5a946708943e8e188006d68849aa503e0007e032c11 Apr 22 17:53:40.008494 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.008478 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:53:40.052812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.052786 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:53:40.055074 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.055028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal" event={"ID":"fdf6ed0045bf81395f30896fa82f74ae","Type":"ContainerStarted","Data":"dfb99b870b2c4548878bf5a946708943e8e188006d68849aa503e0007e032c11"} Apr 22 17:53:40.056021 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.056000 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal" 
event={"ID":"fc4d4c20e132e1722348f099c5424474","Type":"ContainerStarted","Data":"1a76f3f253f8f871c1b2c7466b45e7182b3b10419c5ec69511620189aa0130d9"} Apr 22 17:53:40.061448 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-netns\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061507 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061456 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-cni-binary-copy\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.061507 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061471 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-agent-certs\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:53:40.061507 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-node-log\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061600 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061544 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-netns\") pod 
\"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061706 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-node-log\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.061788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-systemd-units\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-script-lib\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061788 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061782 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-daemon-config\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061796 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061810 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-var-lib-kubelet\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061875 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-var-lib-kubelet\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-systemd-units\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061889 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061907 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-slash\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061944 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-systemd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061927 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061991 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-systemd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.061993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.061991 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-slash\") pod \"ovnkube-node-8zxzp\" (UID: 
\"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-var-lib-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062054 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9v7\" (UniqueName: \"kubernetes.io/projected/75123953-ef56-489a-8b07-e5d0a129fad3-kube-api-access-xn9v7\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062080 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cnibin\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062085 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-var-lib-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2c4\" (UniqueName: 
\"kubernetes.io/projected/b86c7b56-b95e-4b34-8a02-a7cbb80decae-kube-api-access-jm2c4\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062138 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e27bc8-4e3e-4655-9e83-8aed674d5e93-host\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062138 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cnibin\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062164 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-systemd\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-tuned\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-cni-binary-copy\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062330 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-cnibin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062348 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-systemd\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062380 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-etc-selinux\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062395 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-cnibin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.062398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062407 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-kubernetes\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062394 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-script-lib\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-kubernetes\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062432 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdmh\" (UniqueName: \"kubernetes.io/projected/bcc515e3-814e-41a4-9bbe-dc0050efd02c-kube-api-access-xcdmh\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062525 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062553 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062568 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-kubelet\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062606 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-conf-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.062611 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-daemon-config\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062630 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-ovn\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062658 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-device-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062682 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-conf-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062684 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-modprobe-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.062698 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-ovn\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062701 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.063030 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062660 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-kubelet\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-sys\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062772 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-modprobe-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-env-overrides\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062837 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-d\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062841 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-sys\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062905 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-netns\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062932 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-bin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062955 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-multus\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062976 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062983 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-netns\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.062997 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7htf\" (UniqueName: \"kubernetes.io/projected/2c1e2467-3796-47ad-928c-f82f435261e9-kube-api-access-t7htf\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063016 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-bin\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063029 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-var-lib-cni-multus\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063021 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-kubelet\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063053 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063061 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-log-socket\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.063801 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:53:40.063100 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-kubelet\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-netd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063130 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-konnectivity-ca\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e27bc8-4e3e-4655-9e83-8aed674d5e93-serviceca\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063165 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-netd\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.063181 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr7s\" (UniqueName: \"kubernetes.io/projected/58e27bc8-4e3e-4655-9e83-8aed674d5e93-kube-api-access-wgr7s\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063204 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-env-overrides\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-config\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063217 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-log-socket\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-system-cni-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.064651 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-os-release\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-system-cni-dir\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063324 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-os-release\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063354 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vht\" (UniqueName: \"kubernetes.io/projected/49c92d77-2103-47f4-a13f-cf6f14fa5779-kube-api-access-p8vht\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063393 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b86c7b56-b95e-4b34-8a02-a7cbb80decae-os-release\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: 
\"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-os-release\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysconfig\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.064651 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063436 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dc9de976-617d-407c-9074-f0ad44c2518d-iptables-alerter-script\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysconfig\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063462 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkbd\" (UniqueName: \"kubernetes.io/projected/dc9de976-617d-407c-9074-f0ad44c2518d-kube-api-access-mhkbd\") pod 
\"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063490 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75123953-ef56-489a-8b07-e5d0a129fad3-ovn-node-metrics-cert\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-system-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063614 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-socket-dir-parent\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063628 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/75123953-ef56-489a-8b07-e5d0a129fad3-ovnkube-config\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063641 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-k8s-cni-cncf-io\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063665 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-hostroot\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063688 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9x7\" (UniqueName: 
\"kubernetes.io/projected/d41f25be-a0c4-4095-99e5-f6190accf5a8-kube-api-access-4p9x7\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-socket-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063746 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-registration-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-run\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-k8s-cni-cncf-io\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.065317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063799 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063820 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-run-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-multus-certs\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063892 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-sys-fs\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-system-cni-dir\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.063911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-host\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-tmp\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063948 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-conf\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063953 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-multus-socket-dir-parent\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-lib-modules\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063969 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dc9de976-617d-407c-9074-f0ad44c2518d-iptables-alerter-script\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc9de976-617d-407c-9074-f0ad44c2518d-host-slash\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.063747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-konnectivity-ca\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064050 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-hostroot\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064066 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-lib-modules\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064098 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-etc-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064127 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-bin\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-etc-kubernetes\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064194 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064236 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.064247 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-run\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064249 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-etc-openvswitch\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064306 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75123953-ef56-489a-8b07-e5d0a129fad3-host-cni-bin\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.064311 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-etc-kubernetes\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064364 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-host\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-sysctl-conf\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.064420 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.564388489 +0000 UTC m=+2.042911901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc9de976-617d-407c-9074-f0ad44c2518d-host-slash\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.064553 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d41f25be-a0c4-4095-99e5-f6190accf5a8-host-run-multus-certs\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.065295 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-etc-tuned\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.065339 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b86c7b56-b95e-4b34-8a02-a7cbb80decae-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.065836 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8a12c58d-7667-4b16-8b5b-9fb5d4f10530-agent-certs\") pod \"konnectivity-agent-fgqcv\" (UID: \"8a12c58d-7667-4b16-8b5b-9fb5d4f10530\") " pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.066254 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/75123953-ef56-489a-8b07-e5d0a129fad3-ovn-node-metrics-cert\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.066981 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.066488 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcc515e3-814e-41a4-9bbe-dc0050efd02c-tmp\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.084027 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.084007 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:40.084027 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.084025 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:40.084161 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.084035 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:40.084161 ip-10-0-143-11 
kubenswrapper[2564]: E0422 17:53:40.084092 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.584078802 +0000 UTC m=+2.062602215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:40.086313 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.086289 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9x7\" (UniqueName: \"kubernetes.io/projected/d41f25be-a0c4-4095-99e5-f6190accf5a8-kube-api-access-4p9x7\") pod \"multus-ctdsd\" (UID: \"d41f25be-a0c4-4095-99e5-f6190accf5a8\") " pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.086426 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.086322 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7htf\" (UniqueName: \"kubernetes.io/projected/2c1e2467-3796-47ad-928c-f82f435261e9-kube-api-access-t7htf\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:40.086426 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.086367 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdmh\" (UniqueName: \"kubernetes.io/projected/bcc515e3-814e-41a4-9bbe-dc0050efd02c-kube-api-access-xcdmh\") pod \"tuned-tr9fb\" (UID: \"bcc515e3-814e-41a4-9bbe-dc0050efd02c\") " pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 
17:53:40.086426 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.086368 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2c4\" (UniqueName: \"kubernetes.io/projected/b86c7b56-b95e-4b34-8a02-a7cbb80decae-kube-api-access-jm2c4\") pod \"multus-additional-cni-plugins-rdt5n\" (UID: \"b86c7b56-b95e-4b34-8a02-a7cbb80decae\") " pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.086587 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.086567 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9v7\" (UniqueName: \"kubernetes.io/projected/75123953-ef56-489a-8b07-e5d0a129fad3-kube-api-access-xn9v7\") pod \"ovnkube-node-8zxzp\" (UID: \"75123953-ef56-489a-8b07-e5d0a129fad3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.087136 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.087117 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkbd\" (UniqueName: \"kubernetes.io/projected/dc9de976-617d-407c-9074-f0ad44c2518d-kube-api-access-mhkbd\") pod \"iptables-alerter-dwg7z\" (UID: \"dc9de976-617d-407c-9074-f0ad44c2518d\") " pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.106923 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.106895 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:40.164497 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164466 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164505 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e27bc8-4e3e-4655-9e83-8aed674d5e93-host\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164529 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-etc-selinux\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164584 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164605 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-device-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164636 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-device-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:53:40.164583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e27bc8-4e3e-4655-9e83-8aed674d5e93-host\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164648 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-etc-selinux\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.164665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e27bc8-4e3e-4655-9e83-8aed674d5e93-serviceca\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164692 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr7s\" (UniqueName: \"kubernetes.io/projected/58e27bc8-4e3e-4655-9e83-8aed674d5e93-kube-api-access-wgr7s\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vht\" (UniqueName: \"kubernetes.io/projected/49c92d77-2103-47f4-a13f-cf6f14fa5779-kube-api-access-p8vht\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 
kubenswrapper[2564]: I0422 17:53:40.164756 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-socket-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164780 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-registration-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-sys-fs\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164883 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-registration-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-sys-fs\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.164920 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49c92d77-2103-47f4-a13f-cf6f14fa5779-socket-dir\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.165284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.165025 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e27bc8-4e3e-4655-9e83-8aed674d5e93-serviceca\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.175267 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.175240 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vht\" (UniqueName: \"kubernetes.io/projected/49c92d77-2103-47f4-a13f-cf6f14fa5779-kube-api-access-p8vht\") pod \"aws-ebs-csi-driver-node-s8bbw\" (UID: \"49c92d77-2103-47f4-a13f-cf6f14fa5779\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.175367 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.175240 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr7s\" (UniqueName: \"kubernetes.io/projected/58e27bc8-4e3e-4655-9e83-8aed674d5e93-kube-api-access-wgr7s\") pod \"node-ca-d9mrk\" (UID: \"58e27bc8-4e3e-4655-9e83-8aed674d5e93\") " pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.272657 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.272571 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:53:40.279771 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.279743 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75123953_ef56_489a_8b07_e5d0a129fad3.slice/crio-34c52504ae27b9c1195ea54ca2b73bce37eb729385c45e1152d590bbdfebc66c WatchSource:0}: Error finding container 34c52504ae27b9c1195ea54ca2b73bce37eb729385c45e1152d590bbdfebc66c: Status 404 returned error can't find the container with id 34c52504ae27b9c1195ea54ca2b73bce37eb729385c45e1152d590bbdfebc66c Apr 22 17:53:40.285632 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.285615 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" Apr 22 17:53:40.291191 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.291165 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc515e3_814e_41a4_9bbe_dc0050efd02c.slice/crio-ba37e9b3d536dc3845b34325f166f8e7d395cdce8f8d72497ba0f01beed13161 WatchSource:0}: Error finding container ba37e9b3d536dc3845b34325f166f8e7d395cdce8f8d72497ba0f01beed13161: Status 404 returned error can't find the container with id ba37e9b3d536dc3845b34325f166f8e7d395cdce8f8d72497ba0f01beed13161 Apr 22 17:53:40.312391 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.312364 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" Apr 22 17:53:40.313008 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.312993 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:40.317305 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.317288 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ctdsd" Apr 22 17:53:40.319492 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.319465 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86c7b56_b95e_4b34_8a02_a7cbb80decae.slice/crio-a82b525dad292b091dc0a5db403112a824bd6ae78094681c2b9d5580c8967d56 WatchSource:0}: Error finding container a82b525dad292b091dc0a5db403112a824bd6ae78094681c2b9d5580c8967d56: Status 404 returned error can't find the container with id a82b525dad292b091dc0a5db403112a824bd6ae78094681c2b9d5580c8967d56 Apr 22 17:53:40.323291 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.323271 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwg7z" Apr 22 17:53:40.325133 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.325112 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41f25be_a0c4_4095_99e5_f6190accf5a8.slice/crio-66b1ac5c0f9887bab28fd1ceb4b09390db901a8f94670ae0bb252ce234886ce5 WatchSource:0}: Error finding container 66b1ac5c0f9887bab28fd1ceb4b09390db901a8f94670ae0bb252ce234886ce5: Status 404 returned error can't find the container with id 66b1ac5c0f9887bab28fd1ceb4b09390db901a8f94670ae0bb252ce234886ce5 Apr 22 17:53:40.329137 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.329117 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:53:40.329919 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.329898 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9de976_617d_407c_9074_f0ad44c2518d.slice/crio-6676ae84e108f3fea0037026271658814c47bad0c32e0de5bb6d4e130d962ec2 WatchSource:0}: Error finding container 6676ae84e108f3fea0037026271658814c47bad0c32e0de5bb6d4e130d962ec2: Status 404 returned error can't find the container with id 6676ae84e108f3fea0037026271658814c47bad0c32e0de5bb6d4e130d962ec2 Apr 22 17:53:40.333743 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.333716 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" Apr 22 17:53:40.335604 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.335583 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a12c58d_7667_4b16_8b5b_9fb5d4f10530.slice/crio-3efc08e1a47d109d423695bb19b6d9fb75ecde64f4981265d7f8d7fb4291eb81 WatchSource:0}: Error finding container 3efc08e1a47d109d423695bb19b6d9fb75ecde64f4981265d7f8d7fb4291eb81: Status 404 returned error can't find the container with id 3efc08e1a47d109d423695bb19b6d9fb75ecde64f4981265d7f8d7fb4291eb81 Apr 22 17:53:40.340053 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.340037 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d9mrk" Apr 22 17:53:40.340326 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.340305 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49c92d77_2103_47f4_a13f_cf6f14fa5779.slice/crio-f2205b94c20a16cd97e91b20f8a558e9bd8f7764d6f82687ce2f28c1dcfaf0a8 WatchSource:0}: Error finding container f2205b94c20a16cd97e91b20f8a558e9bd8f7764d6f82687ce2f28c1dcfaf0a8: Status 404 returned error can't find the container with id f2205b94c20a16cd97e91b20f8a558e9bd8f7764d6f82687ce2f28c1dcfaf0a8 Apr 22 17:53:40.346403 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:53:40.346379 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e27bc8_4e3e_4655_9e83_8aed674d5e93.slice/crio-99983bfea463edd729019652e34fc34fe200d94fcd1c051e427c3fc275bc74b2 WatchSource:0}: Error finding container 99983bfea463edd729019652e34fc34fe200d94fcd1c051e427c3fc275bc74b2: Status 404 returned error can't find the container with id 99983bfea463edd729019652e34fc34fe200d94fcd1c051e427c3fc275bc74b2 Apr 22 17:53:40.568180 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.568022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:40.568334 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.568206 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:40.568334 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.568276 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.568255043 +0000 UTC m=+3.046778458 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:40.670049 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.668994 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:53:40.670049 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.669160 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:40.670049 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.669179 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:40.670049 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.669193 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:40.670049 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:40.669249 2564 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.66923033 +0000 UTC m=+3.147753747 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:40.996419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.996322 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:39 +0000 UTC" deadline="2027-12-16 02:47:08.389577259 +0000 UTC" Apr 22 17:53:40.996419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:40.996368 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14456h53m27.393215984s" Apr 22 17:53:41.078746 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.078704 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" event={"ID":"49c92d77-2103-47f4-a13f-cf6f14fa5779","Type":"ContainerStarted","Data":"f2205b94c20a16cd97e91b20f8a558e9bd8f7764d6f82687ce2f28c1dcfaf0a8"} Apr 22 17:53:41.090417 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.090337 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgqcv" event={"ID":"8a12c58d-7667-4b16-8b5b-9fb5d4f10530","Type":"ContainerStarted","Data":"3efc08e1a47d109d423695bb19b6d9fb75ecde64f4981265d7f8d7fb4291eb81"} Apr 22 17:53:41.099837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.099770 
2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwg7z" event={"ID":"dc9de976-617d-407c-9074-f0ad44c2518d","Type":"ContainerStarted","Data":"6676ae84e108f3fea0037026271658814c47bad0c32e0de5bb6d4e130d962ec2"} Apr 22 17:53:41.107502 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.107397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ctdsd" event={"ID":"d41f25be-a0c4-4095-99e5-f6190accf5a8","Type":"ContainerStarted","Data":"66b1ac5c0f9887bab28fd1ceb4b09390db901a8f94670ae0bb252ce234886ce5"} Apr 22 17:53:41.116096 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.116064 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" event={"ID":"bcc515e3-814e-41a4-9bbe-dc0050efd02c","Type":"ContainerStarted","Data":"ba37e9b3d536dc3845b34325f166f8e7d395cdce8f8d72497ba0f01beed13161"} Apr 22 17:53:41.122330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.122210 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9mrk" event={"ID":"58e27bc8-4e3e-4655-9e83-8aed674d5e93","Type":"ContainerStarted","Data":"99983bfea463edd729019652e34fc34fe200d94fcd1c051e427c3fc275bc74b2"} Apr 22 17:53:41.131705 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.131675 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerStarted","Data":"a82b525dad292b091dc0a5db403112a824bd6ae78094681c2b9d5580c8967d56"} Apr 22 17:53:41.151290 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.151257 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"34c52504ae27b9c1195ea54ca2b73bce37eb729385c45e1152d590bbdfebc66c"} Apr 22 17:53:41.362327 ip-10-0-143-11 
kubenswrapper[2564]: I0422 17:53:41.362247 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:41.576900 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.576779 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:41.577090 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.576943 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:41.577090 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.577009 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:43.576989541 +0000 UTC m=+5.055512948 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:41.677398 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.677320 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:41.677555 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.677533 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:41.677627 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.677576 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:41.677627 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.677590 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:41.677727 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:41.677716 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:43.677697628 +0000 UTC m=+5.156221044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:41.997536 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.997441 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:39 +0000 UTC" deadline="2028-02-07 17:25:05.357112446 +0000 UTC"
Apr 22 17:53:41.997536 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:41.997521 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15743h31m23.359595973s"
Apr 22 17:53:42.053526 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:42.053495 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:42.053702 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:42.053621 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:42.054114 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:42.054094 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:42.054221 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:42.054205 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:42.543037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:42.542991 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:43.594107 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:43.594069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:43.594548 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.594206 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:43.594548 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.594274 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:47.594254887 +0000 UTC m=+9.072778313 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:43.695115 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:43.695075 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:43.695354 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.695236 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:43.695354 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.695268 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:43.695354 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.695282 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:43.695354 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:43.695345 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:47.695326738 +0000 UTC m=+9.173850154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:44.053800 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:44.053203 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:44.053800 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:44.053346 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:44.053800 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:44.053202 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:44.053800 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:44.053559 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:46.053170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:46.053136 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:46.053170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:46.053155 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:46.053679 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:46.053273 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:46.053679 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:46.053514 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:47.626784 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:47.626624 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:47.627246 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.626800 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:47.627246 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.626902 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:55.626881185 +0000 UTC m=+17.105404586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:47.727984 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:47.727847 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:47.728175 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.728037 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:47.728175 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.728062 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:47.728175 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.728079 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:47.728175 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:47.728162 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:55.728143354 +0000 UTC m=+17.206666762 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:48.053758 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:48.053662 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:48.053932 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:48.053796 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:48.054084 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:48.054065 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:48.054175 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:48.054161 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:50.053324 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:50.053229 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:50.053324 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:50.053251 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:50.053785 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:50.053357 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:50.053785 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:50.053519 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:52.053708 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:52.053673 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:52.054273 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:52.053673 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:52.054273 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:52.053812 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:52.054273 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:52.053890 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:52.921539 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:52.921328 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5lc5s"]
Apr 22 17:53:52.969401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:52.969366 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:52.969562 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:52.969449 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7"
Apr 22 17:53:53.067129 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.067083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-dbus\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.067559 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.067150 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-kubelet-config\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.067559 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.067198 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168561 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.168520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-kubelet-config\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.168584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.168635 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-dbus\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.168661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-kubelet-config\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168901 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:53.168743 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:53.168901 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.168802 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-dbus\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.168901 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:53.168813 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:53.668798414 +0000 UTC m=+15.147321814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:53.671480 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:53.671443 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:53.671678 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:53.671571 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:53.671678 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:53.671649 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:54.671627108 +0000 UTC m=+16.150150534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:54.053338 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:54.053259 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:54.053490 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:54.053265 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:54.053490 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:54.053362 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:54.053490 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:54.053443 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:54.679606 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:54.679569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:54.680051 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:54.679697 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:54.680051 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:54.679766 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.679745541 +0000 UTC m=+18.158268953 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:55.053078 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.052990 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:55.053245 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.053132 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7"
Apr 22 17:53:55.456164 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.456084 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gglf5"]
Apr 22 17:53:55.481337 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.481300 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.484069 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.484041 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:53:55.484227 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.484068 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cspk\""
Apr 22 17:53:55.484227 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.484139 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:53:55.586810 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.586774 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d8548-161a-40c6-aa4f-a43f0cb0ff07-hosts-file\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.586974 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.586856 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/494d8548-161a-40c6-aa4f-a43f0cb0ff07-tmp-dir\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.586974 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.586906 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzm7\" (UniqueName: \"kubernetes.io/projected/494d8548-161a-40c6-aa4f-a43f0cb0ff07-kube-api-access-6mzm7\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.688180 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d8548-161a-40c6-aa4f-a43f0cb0ff07-hosts-file\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.688180 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688194 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:55.688669 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688299 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d8548-161a-40c6-aa4f-a43f0cb0ff07-hosts-file\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.688669 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.688319 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:55.688669 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.688426 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.688405027 +0000 UTC m=+33.166928428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:55.688669 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688514 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/494d8548-161a-40c6-aa4f-a43f0cb0ff07-tmp-dir\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.688669 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688554 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzm7\" (UniqueName: \"kubernetes.io/projected/494d8548-161a-40c6-aa4f-a43f0cb0ff07-kube-api-access-6mzm7\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.688892 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.688852 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/494d8548-161a-40c6-aa4f-a43f0cb0ff07-tmp-dir\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.697276 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.697253 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzm7\" (UniqueName: \"kubernetes.io/projected/494d8548-161a-40c6-aa4f-a43f0cb0ff07-kube-api-access-6mzm7\") pod \"node-resolver-gglf5\" (UID: \"494d8548-161a-40c6-aa4f-a43f0cb0ff07\") " pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:55.789363 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.789235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:55.789490 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.789422 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:55.789490 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.789447 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:55.789490 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.789458 2564 projected.go:194] Error preparing data for projected volume kube-api-access-wplh4 for pod openshift-network-diagnostics/network-check-target-rx62v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:55.789612 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:55.789515 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4 podName:6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.789501393 +0000 UTC m=+33.268024799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wplh4" (UniqueName: "kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4") pod "network-check-target-rx62v" (UID: "6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:55.791259 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:55.791235 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gglf5"
Apr 22 17:53:56.053250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:56.053170 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:53:56.053410 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:56.053170 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:53:56.053410 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:56.053298 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:53:56.053410 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:56.053363 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c"
Apr 22 17:53:56.695765 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:56.695727 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:53:56.696220 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:56.695898 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:56.696220 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:56.695969 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.695950926 +0000 UTC m=+22.174474332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:57.053459 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:57.053366 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:53:57.053620 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:57.053505 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:53:58.053078 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.053049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:53:58.053480 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.053049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:53:58.053480 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:58.053154 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:53:58.053480 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:58.053261 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:53:58.189449 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.188764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" event={"ID":"bcc515e3-814e-41a4-9bbe-dc0050efd02c","Type":"ContainerStarted","Data":"e2b61bc099c25b00b203f51060d7085fa79d0a73ba8254b618bfc325d5088e74"} Apr 22 17:53:58.191250 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.191198 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"1322ba161bdce060b09545f6991acd45e0fb3c7c0f9edd3b95b043d68400241a"} Apr 22 17:53:58.193165 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.192799 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gglf5" event={"ID":"494d8548-161a-40c6-aa4f-a43f0cb0ff07","Type":"ContainerStarted","Data":"fb49542627da8b4f26343d6e4ef0701f4495c4962096fe979c356abed345cf9f"} Apr 22 17:53:58.194188 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.194143 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal" event={"ID":"fdf6ed0045bf81395f30896fa82f74ae","Type":"ContainerStarted","Data":"7fa8999d61efc67df31ff7939ac71f20a78e6fe3d3565d43a3e086b97d0482fe"} Apr 22 17:53:58.215947 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.215893 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tr9fb" podStartSLOduration=1.7870203930000002 podStartE2EDuration="19.215875998s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.292918655 +0000 UTC m=+1.771442059" lastFinishedPulling="2026-04-22 17:53:57.721774246 +0000 UTC m=+19.200297664" observedRunningTime="2026-04-22 
17:53:58.21562561 +0000 UTC m=+19.694149045" watchObservedRunningTime="2026-04-22 17:53:58.215875998 +0000 UTC m=+19.694399413" Apr 22 17:53:58.229723 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:58.229671 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-11.ec2.internal" podStartSLOduration=19.229658624 podStartE2EDuration="19.229658624s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:58.229635109 +0000 UTC m=+19.708158529" watchObservedRunningTime="2026-04-22 17:53:58.229658624 +0000 UTC m=+19.708182046" Apr 22 17:53:59.054471 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.054297 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:53:59.055167 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:53:59.054535 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:53:59.196874 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.196832 2564 generic.go:358] "Generic (PLEG): container finished" podID="fc4d4c20e132e1722348f099c5424474" containerID="97d8d25d83c76ec9281fef88ef19abee3014147445e6f926fbb9b9854634b4dc" exitCode=0 Apr 22 17:53:59.197022 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.196927 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal" event={"ID":"fc4d4c20e132e1722348f099c5424474","Type":"ContainerDied","Data":"97d8d25d83c76ec9281fef88ef19abee3014147445e6f926fbb9b9854634b4dc"} Apr 22 17:53:59.198760 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.198730 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d9mrk" event={"ID":"58e27bc8-4e3e-4655-9e83-8aed674d5e93","Type":"ContainerStarted","Data":"b13adfa6f74add911fe378f511ced0dcfdfd803ba5e1a03ed779f4e90fda5af9"} Apr 22 17:53:59.200243 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.200222 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="f3b93fb7eacd579e983167b6582e4f7482ec324bc8a7c9b977492c5d1a2581d8" exitCode=0 Apr 22 17:53:59.200306 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.200294 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"f3b93fb7eacd579e983167b6582e4f7482ec324bc8a7c9b977492c5d1a2581d8"} Apr 22 17:53:59.202833 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.202817 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:53:59.203140 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:53:59.203122 2564 generic.go:358] "Generic (PLEG): container finished" podID="75123953-ef56-489a-8b07-e5d0a129fad3" containerID="26c90fbaeed6fd1d61ad333c1be29ea111eae203a5536188b4f4566fa68ebe5c" exitCode=1 Apr 22 17:53:59.203204 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.203183 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerDied","Data":"26c90fbaeed6fd1d61ad333c1be29ea111eae203a5536188b4f4566fa68ebe5c"} Apr 22 17:53:59.203254 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.203212 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"b453483ec976d9990ffcda2fce7304193ac0aed310322ef3a0428350b2b08728"} Apr 22 17:53:59.203254 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.203222 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"0527d06389f9b0d0784c664bab36e0b55cabb307319ea6b63e91411fac33ef30"} Apr 22 17:53:59.203254 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.203231 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"240ac0b6c9698df868f8541846e6c5bf3e5c97fefd0ec3d889a76c5f5d8da8cd"} Apr 22 17:53:59.203254 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.203239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"8ae17a2f8959e0bff89c96404b86552c00c6a8fbbec52111b260fb0704d82d36"} Apr 22 17:53:59.204520 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.204416 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gglf5" event={"ID":"494d8548-161a-40c6-aa4f-a43f0cb0ff07","Type":"ContainerStarted","Data":"abb6645259b0c5a543daa1bda296f782a354003d6f68a23674328f9d4e0f3302"} Apr 22 17:53:59.205767 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.205748 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" event={"ID":"49c92d77-2103-47f4-a13f-cf6f14fa5779","Type":"ContainerStarted","Data":"e1c7392ec4dbb0c086f33bfe5ca8a381ecdddce49d37e67a474275c02ed72105"} Apr 22 17:53:59.206993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.206963 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgqcv" event={"ID":"8a12c58d-7667-4b16-8b5b-9fb5d4f10530","Type":"ContainerStarted","Data":"ed959d301b4785063be681fac823459f3ad6f7efb2f63ec36691a8e97cc62eb6"} Apr 22 17:53:59.208206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.208186 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwg7z" event={"ID":"dc9de976-617d-407c-9074-f0ad44c2518d","Type":"ContainerStarted","Data":"2548e09bf4df2da61bbc3f9985f65fe6cc5da54a2c4b0afca6797a81cf2c4f90"} Apr 22 17:53:59.209655 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.209548 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ctdsd" event={"ID":"d41f25be-a0c4-4095-99e5-f6190accf5a8","Type":"ContainerStarted","Data":"17f5cfd84f14e2d9e146e427b4fcfd5806bb692326b8233d9a2b7e061e20f871"} Apr 22 17:53:59.221063 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.221023 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dwg7z" podStartSLOduration=2.835655451 podStartE2EDuration="20.221010918s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.332016268 +0000 UTC 
m=+1.810539672" lastFinishedPulling="2026-04-22 17:53:57.717371723 +0000 UTC m=+19.195895139" observedRunningTime="2026-04-22 17:53:59.220574472 +0000 UTC m=+20.699097899" watchObservedRunningTime="2026-04-22 17:53:59.221010918 +0000 UTC m=+20.699534341" Apr 22 17:53:59.250275 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.250220 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fgqcv" podStartSLOduration=2.870283514 podStartE2EDuration="20.250201417s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.337515678 +0000 UTC m=+1.816039082" lastFinishedPulling="2026-04-22 17:53:57.717433572 +0000 UTC m=+19.195956985" observedRunningTime="2026-04-22 17:53:59.233420425 +0000 UTC m=+20.711943878" watchObservedRunningTime="2026-04-22 17:53:59.250201417 +0000 UTC m=+20.728724842" Apr 22 17:53:59.260431 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.260380 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d9mrk" podStartSLOduration=2.890322094 podStartE2EDuration="20.260333641s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.347821833 +0000 UTC m=+1.826345234" lastFinishedPulling="2026-04-22 17:53:57.717833361 +0000 UTC m=+19.196356781" observedRunningTime="2026-04-22 17:53:59.260148343 +0000 UTC m=+20.738671770" watchObservedRunningTime="2026-04-22 17:53:59.260333641 +0000 UTC m=+20.738857066" Apr 22 17:53:59.273432 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.273337 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gglf5" podStartSLOduration=4.27332127 podStartE2EDuration="4.27332127s" podCreationTimestamp="2026-04-22 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:59.272475697 +0000 
UTC m=+20.750999122" watchObservedRunningTime="2026-04-22 17:53:59.27332127 +0000 UTC m=+20.751844694" Apr 22 17:53:59.287747 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.287666 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ctdsd" podStartSLOduration=2.492944421 podStartE2EDuration="20.287646384s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.326921231 +0000 UTC m=+1.805444648" lastFinishedPulling="2026-04-22 17:53:58.121623211 +0000 UTC m=+19.600146611" observedRunningTime="2026-04-22 17:53:59.28701276 +0000 UTC m=+20.765536184" watchObservedRunningTime="2026-04-22 17:53:59.287646384 +0000 UTC m=+20.766169809" Apr 22 17:53:59.736055 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:53:59.736025 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:54:00.022037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.021848 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:59.736050446Z","UUID":"9f148205-8331-4b62-bfbd-c9cf650cfe66","Handler":null,"Name":"","Endpoint":""} Apr 22 17:54:00.023666 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.023642 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:54:00.023666 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.023670 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:54:00.053026 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.052995 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:00.053195 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.053035 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:00.053195 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:00.053114 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:00.053302 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:00.053237 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:00.213483 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.213438 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal" event={"ID":"fc4d4c20e132e1722348f099c5424474","Type":"ContainerStarted","Data":"f0f3021064c8d0fc62ea62de138ed74bf327c74fe39e88f3d7f1b66ede5e8eb8"} Apr 22 17:54:00.215785 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.215756 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" event={"ID":"49c92d77-2103-47f4-a13f-cf6f14fa5779","Type":"ContainerStarted","Data":"e9ed2293ce312fd175035267357221f90cd72931f46cda1aa66a8f1adb34432a"} Apr 22 17:54:00.228352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.228295 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-11.ec2.internal" podStartSLOduration=21.228276286 podStartE2EDuration="21.228276286s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:54:00.228225254 +0000 UTC m=+21.706748678" watchObservedRunningTime="2026-04-22 17:54:00.228276286 +0000 UTC m=+21.706799714" Apr 22 17:54:00.269490 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.269452 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:54:00.270401 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.270377 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:54:00.723330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:00.723296 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:00.723629 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:00.723453 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:00.723629 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:00.723504 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:08.723491939 +0000 UTC m=+30.202015340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:01.053814 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.053772 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:01.054013 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:01.053913 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:01.220303 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.220278 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:54:01.220723 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.220663 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"6e04df7a0f4070834be13d9b745781354246475f02c7e1fbf77dde7788818e76"} Apr 22 17:54:01.222720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.222677 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" event={"ID":"49c92d77-2103-47f4-a13f-cf6f14fa5779","Type":"ContainerStarted","Data":"a2b4167e20ac3114273c16916a40f810255d6f55c6fe06695639961d007274bf"} Apr 22 17:54:01.222999 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.222963 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:54:01.223448 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.223426 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fgqcv" Apr 22 17:54:01.249765 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:01.249708 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s8bbw" podStartSLOduration=2.114201822 podStartE2EDuration="22.249689703s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.342125111 +0000 UTC m=+1.820648524" lastFinishedPulling="2026-04-22 17:54:00.477612998 +0000 UTC m=+21.956136405" observedRunningTime="2026-04-22 17:54:01.249507182 +0000 UTC 
m=+22.728030604" watchObservedRunningTime="2026-04-22 17:54:01.249689703 +0000 UTC m=+22.728213128" Apr 22 17:54:02.052922 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:02.052887 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:02.053116 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:02.052887 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:02.053116 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:02.053000 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:02.053116 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:02.053101 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:03.053575 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:03.053481 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:03.054087 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:03.053622 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:04.053854 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.053619 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:04.053854 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.053619 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:04.054616 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:04.053901 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:04.054616 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:04.053976 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:04.229801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.229767 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="cf84fd1e7cd335976158eb51aefa5c1293d4cdafd4feef569c410884fc516191" exitCode=0 Apr 22 17:54:04.230006 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.229837 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"cf84fd1e7cd335976158eb51aefa5c1293d4cdafd4feef569c410884fc516191"} Apr 22 17:54:04.232897 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.232879 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:54:04.233251 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.233219 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"d1cd76659c0aa3b4766426d0528f11010aa08d06aeeb9712e9f1ea34e2f3e915"} Apr 22 17:54:04.233569 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.233548 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:04.233622 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.233575 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:04.233723 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.233680 2564 scope.go:117] "RemoveContainer" containerID="26c90fbaeed6fd1d61ad333c1be29ea111eae203a5536188b4f4566fa68ebe5c" Apr 22 17:54:04.248352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:04.248331 
2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:05.053182 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.053147 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:05.053312 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:05.053288 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:05.236922 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.236822 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="f5f0df36d830a65ddab688aead9699c8bcdceee40c41661f62ca4cf5279851e8" exitCode=0 Apr 22 17:54:05.237358 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.236914 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"f5f0df36d830a65ddab688aead9699c8bcdceee40c41661f62ca4cf5279851e8"} Apr 22 17:54:05.241410 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.241388 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:54:05.241765 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.241734 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" 
event={"ID":"75123953-ef56-489a-8b07-e5d0a129fad3","Type":"ContainerStarted","Data":"2bb1fe124113afa01cdc10cec43b7daaa868fc9b0dfe59008529c0244ccce685"} Apr 22 17:54:05.242245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.242224 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:05.256752 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.256725 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:05.293935 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.293886 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" podStartSLOduration=8.81449296 podStartE2EDuration="26.293853933s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.281269108 +0000 UTC m=+1.759792512" lastFinishedPulling="2026-04-22 17:53:57.76063007 +0000 UTC m=+19.239153485" observedRunningTime="2026-04-22 17:54:05.292419471 +0000 UTC m=+26.770942896" watchObservedRunningTime="2026-04-22 17:54:05.293853933 +0000 UTC m=+26.772377363" Apr 22 17:54:05.431029 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.430996 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5lc5s"] Apr 22 17:54:05.431181 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.431132 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:05.431252 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:05.431232 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:05.434445 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.434420 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wcgxk"] Apr 22 17:54:05.434586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.434512 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:05.434644 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:05.434602 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:05.437057 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.437031 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rx62v"] Apr 22 17:54:05.437144 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:05.437119 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:05.437209 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:05.437191 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:06.245405 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:06.245320 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="b089364e1f3993f79e13f19acc14de9078b46adc0c4596fa35daeb70115fbe77" exitCode=0 Apr 22 17:54:06.245405 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:06.245351 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"b089364e1f3993f79e13f19acc14de9078b46adc0c4596fa35daeb70115fbe77"} Apr 22 17:54:07.053748 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:07.053702 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:07.053893 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:07.053846 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:07.053893 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:07.053852 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:07.053999 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:07.053976 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:07.054045 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:07.053702 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:07.054092 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:07.054081 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:08.786509 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:08.786465 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:08.786897 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:08.786615 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:08.786897 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:08.786677 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret podName:6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:24.786659928 +0000 UTC m=+46.265183339 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret") pod "global-pull-secret-syncer-5lc5s" (UID: "6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:09.053976 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.053886 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:09.053976 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.053927 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:54:09.054173 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:09.054011 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5lc5s" podUID="6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7" Apr 22 17:54:09.054173 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.054028 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:09.054173 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:09.054155 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9" Apr 22 17:54:09.054329 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:09.054260 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rx62v" podUID="6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c" Apr 22 17:54:09.826102 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.825912 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-11.ec2.internal" event="NodeReady" Apr 22 17:54:09.826486 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.826231 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:54:09.862296 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.862261 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"] Apr 22 17:54:09.882164 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.882131 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c57866bf4-cx2k4"] Apr 22 17:54:09.882318 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.882267 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:09.885782 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.885104 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 17:54:09.885782 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.885418 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 17:54:09.885782 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.885628 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-q68l9\"" Apr 22 17:54:09.899910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.899883 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sgmcw"] Apr 22 17:54:09.900186 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.900165 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.903156 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.903133 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:54:09.903251 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.903177 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wcdv8\"" Apr 22 17:54:09.903251 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.903159 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:54:09.903465 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.903424 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:54:09.909350 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.909331 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:54:09.917221 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.917201 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"] Apr 22 17:54:09.917324 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.917226 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c57866bf4-cx2k4"] Apr 22 17:54:09.917380 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.917326 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bgpms"] Apr 22 17:54:09.917380 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.917336 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:09.919916 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.919894 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:54:09.920026 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.919941 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:54:09.920026 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.919956 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:54:09.945811 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.945749 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sgmcw"] Apr 22 17:54:09.945811 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.945781 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bgpms"] Apr 22 17:54:09.946068 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.945925 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:09.949124 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.949102 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:54:09.949269 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.949123 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:54:09.949269 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.949109 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:54:09.949486 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.949438 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:54:09.994759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994675 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:09.994759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.994759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994752 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/da61e571-00a2-4ad8-86ad-1156286a7409-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994777 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994803 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994891 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2qc\" (UniqueName: \"kubernetes.io/projected/4b38c080-af4d-4d73-ad1f-c364849e7212-kube-api-access-pb2qc\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994922 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted\") 
pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994945 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtpb\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.994994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.994974 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b38c080-af4d-4d73-ad1f-c364849e7212-tmp-dir\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:09.995245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.995015 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b38c080-af4d-4d73-ad1f-c364849e7212-config-volume\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:09.995245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.995082 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:09.995245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.995112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.995245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.995132 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:09.995245 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:09.995148 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.095561 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095517 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqb8k\" (UniqueName: \"kubernetes.io/projected/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-kube-api-access-pqb8k\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095571 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2qc\" (UniqueName: \"kubernetes.io/projected/4b38c080-af4d-4d73-ad1f-c364849e7212-kube-api-access-pb2qc\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " 
pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095608 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtpb\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b38c080-af4d-4d73-ad1f-c364849e7212-tmp-dir\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095679 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b38c080-af4d-4d73-ad1f-c364849e7212-config-volume\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:10.095720 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095713 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: 
\"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095739 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095766 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095813 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095847 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095888 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.095897 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/da61e571-00a2-4ad8-86ad-1156286a7409-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.095961 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.595941734 +0000 UTC m=+32.074465151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.095985 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.096065 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:10.096081 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.096084 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.096131 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.596114655 +0000 UTC m=+32.074638061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.096157 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.096278 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b38c080-af4d-4d73-ad1f-c364849e7212-tmp-dir\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.096328 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.096471 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:54:10.096611 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.096520 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.596503085 +0000 UTC m=+32.075026499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found
Apr 22 17:54:10.096949 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.096750 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/da61e571-00a2-4ad8-86ad-1156286a7409-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:54:10.097695 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.097668 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.098548 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.098344 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.101317 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.101248 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.101435 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.101333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.104625 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.104598 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtpb\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.105383 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.105364 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.112116 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.112090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b38c080-af4d-4d73-ad1f-c364849e7212-config-volume\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:10.114115 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.114090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2qc\" (UniqueName: \"kubernetes.io/projected/4b38c080-af4d-4d73-ad1f-c364849e7212-kube-api-access-pb2qc\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:10.196851 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.196805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:10.197031 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.196942 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqb8k\" (UniqueName: \"kubernetes.io/projected/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-kube-api-access-pqb8k\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:10.197031 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.196997 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:10.197101 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.197077 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.697057158 +0000 UTC m=+32.175580565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found
Apr 22 17:54:10.206588 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.206555 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqb8k\" (UniqueName: \"kubernetes.io/projected/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-kube-api-access-pqb8k\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:10.600960 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.600921 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:10.600960 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.600971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.601015 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601087 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601107 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601156 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.601140967 +0000 UTC m=+33.079664370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601155 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601171 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.601164876 +0000 UTC m=+33.079688276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found
Apr 22 17:54:10.601211 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601176 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found
Apr 22 17:54:10.601518 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.601253 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.601233127 +0000 UTC m=+33.079756550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found
Apr 22 17:54:10.702363 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:10.702320 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:10.702550 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.702449 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:10.702550 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:10.702522 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.702503956 +0000 UTC m=+33.181027359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found
Apr 22 17:54:11.053751 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.053667 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:54:11.053751 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.053720 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s"
Apr 22 17:54:11.054399 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.053667 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:54:11.058131 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058106 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:54:11.058263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058146 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcf4z\""
Apr 22 17:54:11.058263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058116 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\""
Apr 22 17:54:11.058263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058238 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:54:11.058549 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058535 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:54:11.058832 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.058813 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:54:11.611931 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.611894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:54:11.612154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.612017 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:11.612154 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612046 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:54:11.612154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.612066 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:11.612154 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612130 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.612110951 +0000 UTC m=+35.090634352 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found
Apr 22 17:54:11.612361 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612165 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:11.612361 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612187 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:11.612361 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612202 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found
Apr 22 17:54:11.612361 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612237 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.612215356 +0000 UTC m=+35.090738798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:11.612361 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.612275 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.612259845 +0000 UTC m=+35.090783260 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found
Apr 22 17:54:11.712810 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.712773 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:11.713018 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.712824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk"
Apr 22 17:54:11.713018 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.712937 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:54:11.713018 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.713007 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:43.712989494 +0000 UTC m=+65.191512915 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : secret "metrics-daemon-secret" not found
Apr 22 17:54:11.713018 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.712938 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:11.713210 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:11.713079 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.713060954 +0000 UTC m=+35.191584364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found
Apr 22 17:54:11.814144 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.814110 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:54:11.816668 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.816645 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplh4\" (UniqueName: \"kubernetes.io/projected/6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c-kube-api-access-wplh4\") pod \"network-check-target-rx62v\" (UID: \"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c\") " pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:54:11.974040 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:11.973961 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:54:12.118879 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:12.118833 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rx62v"]
Apr 22 17:54:12.161235 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:54:12.161201 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d8e2e5c_104e_4adf_8584_d1f41c9d3b9c.slice/crio-a11e935bbd62a4c3868c9b2d9e3101d4de83abe5963ea01a5b3c6213ea65d267 WatchSource:0}: Error finding container a11e935bbd62a4c3868c9b2d9e3101d4de83abe5963ea01a5b3c6213ea65d267: Status 404 returned error can't find the container with id a11e935bbd62a4c3868c9b2d9e3101d4de83abe5963ea01a5b3c6213ea65d267
Apr 22 17:54:12.259238 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:12.259199 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rx62v" event={"ID":"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c","Type":"ContainerStarted","Data":"a11e935bbd62a4c3868c9b2d9e3101d4de83abe5963ea01a5b3c6213ea65d267"}
Apr 22 17:54:13.264504 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.264299 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="e8f2513279e4b82fa17eba04bc1ed6e06b213e3036cb43d51baecfd933aec831" exitCode=0
Apr 22 17:54:13.265085 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.264375 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"e8f2513279e4b82fa17eba04bc1ed6e06b213e3036cb43d51baecfd933aec831"}
Apr 22 17:54:13.629286 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.629249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:54:13.629455 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.629351 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:13.629455 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.629393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:13.629455 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629404 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:54:13.629569 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629491 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:17.629473631 +0000 UTC m=+39.107997050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found
Apr 22 17:54:13.629569 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629497 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:13.629569 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629513 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found
Apr 22 17:54:13.629569 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629555 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:17.629539583 +0000 UTC m=+39.108062984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found
Apr 22 17:54:13.629569 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629497 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:13.629755 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.629585 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:17.629578965 +0000 UTC m=+39.108102366 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:13.730406 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:13.730346 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms"
Apr 22 17:54:13.730595 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.730473 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:13.730595 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:13.730550 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:17.730529954 +0000 UTC m=+39.209053356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found
Apr 22 17:54:14.270503 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:14.270464 2564 generic.go:358] "Generic (PLEG): container finished" podID="b86c7b56-b95e-4b34-8a02-a7cbb80decae" containerID="ac53152fd10ba6892d66b51194859ec3cfb9c75b8592eecb89d2e1151e2aa4a7" exitCode=0
Apr 22 17:54:14.270941 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:14.270506 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerDied","Data":"ac53152fd10ba6892d66b51194859ec3cfb9c75b8592eecb89d2e1151e2aa4a7"}
Apr 22 17:54:15.274901 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:15.274851 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" event={"ID":"b86c7b56-b95e-4b34-8a02-a7cbb80decae","Type":"ContainerStarted","Data":"3b01e05117fd7c73d1c1a119f8617ea647b14e7c6c58c7726f6e569e0f78f77e"}
Apr 22 17:54:15.299597 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:15.299552 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rdt5n" podStartSLOduration=4.421022439 podStartE2EDuration="36.299538138s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:40.321941205 +0000 UTC m=+1.800464610" lastFinishedPulling="2026-04-22 17:54:12.200456907 +0000 UTC m=+33.678980309" observedRunningTime="2026-04-22 17:54:15.298514406 +0000 UTC m=+36.777037829" watchObservedRunningTime="2026-04-22 17:54:15.299538138 +0000 UTC m=+36.778061543"
Apr 22 17:54:16.278511 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:16.278472 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rx62v" event={"ID":"6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c","Type":"ContainerStarted","Data":"d5c74a2f1fd25e1bcb56de4a2fce5646bfe8db379b2ebccfd3622937e261b70e"}
Apr 22 17:54:16.279011 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:16.278892 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rx62v"
Apr 22 17:54:16.311426 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:16.311373 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rx62v" podStartSLOduration=34.300836978 podStartE2EDuration="37.311358982s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.178826855 +0000 UTC m=+33.657350255" lastFinishedPulling="2026-04-22 17:54:15.189348853 +0000 UTC m=+36.667872259" observedRunningTime="2026-04-22 17:54:16.310335925 +0000 UTC m=+37.788859338" watchObservedRunningTime="2026-04-22 17:54:16.311358982 +0000 UTC m=+37.789882432"
Apr 22 17:54:17.664144 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:17.664109 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:17.664153 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664237 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664252 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:17.664284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664297 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:25.664283774 +0000 UTC m=+47.142807174 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664238 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664348 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:25.664336421 +0000 UTC m=+47.142859821 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664380 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:54:17.664521 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.664412 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:25.66440308 +0000 UTC m=+47.142926481 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found Apr 22 17:54:17.765627 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:17.765589 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:17.765790 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.765740 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:17.765830 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:17.765803 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:25.765787152 +0000 UTC m=+47.244310561 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found Apr 22 17:54:24.817271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:24.817222 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:24.821127 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:24.821104 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7-original-pull-secret\") pod \"global-pull-secret-syncer-5lc5s\" (UID: \"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7\") " pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:24.880281 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:24.880245 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5lc5s" Apr 22 17:54:24.991441 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:24.991407 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5lc5s"] Apr 22 17:54:25.298681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:25.298646 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5lc5s" event={"ID":"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7","Type":"ContainerStarted","Data":"c11d2914bc2664f254bf6526ac56c730dfd2b8409356893d62a8dc7c10665977"} Apr 22 17:54:25.724662 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:25.724614 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:25.724851 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:25.724676 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:25.724851 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724789 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:54:25.724851 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724805 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found Apr 22 17:54:25.724851 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724789 2564 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:25.725117 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:25.724841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:25.725117 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724898 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.724849323 +0000 UTC m=+63.203372730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found Apr 22 17:54:25.725117 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724937 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:54:25.725117 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.724973 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.724949538 +0000 UTC m=+63.203472954 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found Apr 22 17:54:25.725117 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.725015 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.725001145 +0000 UTC m=+63.203524550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found Apr 22 17:54:25.826053 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:25.826014 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:25.826477 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.826177 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:25.826477 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:25.826244 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.826229148 +0000 UTC m=+63.304752548 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found Apr 22 17:54:30.309230 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:30.309187 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5lc5s" event={"ID":"6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7","Type":"ContainerStarted","Data":"e15defc491099b70c4616c94f3410d51e8fdfe5145715e583c52cdb17f5a7d87"} Apr 22 17:54:30.325328 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:30.325283 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5lc5s" podStartSLOduration=34.058032011 podStartE2EDuration="38.325267648s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:54:24.996265959 +0000 UTC m=+46.474789363" lastFinishedPulling="2026-04-22 17:54:29.2635016 +0000 UTC m=+50.742025000" observedRunningTime="2026-04-22 17:54:30.324398456 +0000 UTC m=+51.802921879" watchObservedRunningTime="2026-04-22 17:54:30.325267648 +0000 UTC m=+51.803791071" Apr 22 17:54:37.258211 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:37.258180 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zxzp" Apr 22 17:54:41.742469 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:41.742428 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:54:41.742895 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:41.742492 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:54:41.742895 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.742696 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:54:41.743000 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.742888 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found Apr 22 17:54:41.743000 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.742972 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:41.743087 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:41.742534 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:54:41.743270 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.743246 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:54:41.743473 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.743455 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:55:13.743422927 +0000 UTC m=+95.221946351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found Apr 22 17:54:41.743757 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.743739 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:55:13.743708933 +0000 UTC m=+95.222232341 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found Apr 22 17:54:41.743906 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.743890 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:13.743855321 +0000 UTC m=+95.222378728 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found Apr 22 17:54:41.844546 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:41.844510 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:54:41.844727 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.844599 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:41.844727 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:41.844663 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:55:13.844647482 +0000 UTC m=+95.323170884 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found Apr 22 17:54:43.757386 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:43.757350 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:54:43.757785 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:43.757477 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:54:43.757785 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:54:43.757527 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:47.757513935 +0000 UTC m=+129.236037336 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : secret "metrics-daemon-secret" not found Apr 22 17:54:48.284894 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:54:48.284843 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rx62v" Apr 22 17:55:13.765311 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:55:13.765271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:55:13.765311 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:55:13.765318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:55:13.765347 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765426 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found 
Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765430 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765452 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765482 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.765469535 +0000 UTC m=+159.243992949 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765503 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.765489021 +0000 UTC m=+159.244012422 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765428 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:55:13.765818 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.765530 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.765523835 +0000 UTC m=+159.244047236 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found Apr 22 17:55:13.866299 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:55:13.866257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:55:13.866447 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.866401 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:55:13.866484 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:13.866466 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. 
No retries permitted until 2026-04-22 17:56:17.866451009 +0000 UTC m=+159.344974410 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found Apr 22 17:55:47.807380 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:55:47.807342 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:55:47.807848 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:47.807492 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:55:47.807848 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:55:47.807564 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs podName:2c1e2467-3796-47ad-928c-f82f435261e9 nodeName:}" failed. No retries permitted until 2026-04-22 17:57:49.807544368 +0000 UTC m=+251.286067772 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs") pod "network-metrics-daemon-wcgxk" (UID: "2c1e2467-3796-47ad-928c-f82f435261e9") : secret "metrics-daemon-secret" not found Apr 22 17:56:10.038118 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.038086 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g6jpq"] Apr 22 17:56:10.040986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.040957 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"] Apr 22 17:56:10.041123 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.041105 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" Apr 22 17:56:10.043428 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043411 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-768bdd6654-j6n7q"] Apr 22 17:56:10.043616 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043547 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:10.043815 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043738 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:56:10.043938 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043840 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:56:10.043938 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043906 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 17:56:10.043938 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.043915 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 17:56:10.044686 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.044668 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-qtcvz\"" Apr 22 17:56:10.046294 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.046257 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.047271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.047252 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tjk72\""
Apr 22 17:56:10.047501 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.047482 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 17:56:10.047605 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.047499 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:56:10.047605 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.047490 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:56:10.047689 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.047669 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 17:56:10.048956 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.048937 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 17:56:10.049032 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.048954 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 17:56:10.049032 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.048983 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 17:56:10.049770 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.049751 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 17:56:10.049945 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.049834 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 17:56:10.049945 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.049849 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 17:56:10.050092 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.050007 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hbkwr\""
Apr 22 17:56:10.055908 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.055885 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 17:56:10.056935 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.056912 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"]
Apr 22 17:56:10.058771 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.058749 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g6jpq"]
Apr 22 17:56:10.059492 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.059471 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-768bdd6654-j6n7q"]
Apr 22 17:56:10.066221 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066202 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.066305 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.066305 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066244 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bb31bacb-bc3d-4ab6-93ea-80941f907553-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.066305 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066282 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-tmp\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.066454 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066314 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-serving-cert\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.066454 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066342 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.066454 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066364 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-stats-auth\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.066454 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066419 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.066650 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066494 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-service-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.066650 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066544 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprh9\" (UniqueName: \"kubernetes.io/projected/bb31bacb-bc3d-4ab6-93ea-80941f907553-kube-api-access-lprh9\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.066650 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066578 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-default-certificate\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.066650 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066603 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg8ss\" (UniqueName: \"kubernetes.io/projected/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-kube-api-access-sg8ss\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.066650 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066640 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkcl\" (UniqueName: \"kubernetes.io/projected/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-kube-api-access-sjkcl\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.066805 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.066692 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-snapshots\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.167474 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167442 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-snapshots\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.167474 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167474 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167495 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167522 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bb31bacb-bc3d-4ab6-93ea-80941f907553-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167549 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-tmp\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-serving-cert\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.167603 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167617 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167642 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-stats-auth\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.167679 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:10.66765753 +0000 UTC m=+152.146180946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found
Apr 22 17:56:10.167726 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167711 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167774 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-service-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167815 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lprh9\" (UniqueName: \"kubernetes.io/projected/bb31bacb-bc3d-4ab6-93ea-80941f907553-kube-api-access-lprh9\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-default-certificate\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.167887 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:10.667845206 +0000 UTC m=+152.146368623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg8ss\" (UniqueName: \"kubernetes.io/projected/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-kube-api-access-sg8ss\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.167978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkcl\" (UniqueName: \"kubernetes.io/projected/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-kube-api-access-sjkcl\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.168068 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-tmp\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.168205 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.168196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-snapshots\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.168609 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.168406 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:10.168609 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.168452 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:10.668438584 +0000 UTC m=+152.146961984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:10.168609 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.168450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bb31bacb-bc3d-4ab6-93ea-80941f907553-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.168609 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.168568 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-service-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.168742 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.168678 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.170206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.170187 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-default-certificate\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.170308 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.170291 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-stats-auth\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.170513 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.170494 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-serving-cert\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.178348 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.178323 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg8ss\" (UniqueName: \"kubernetes.io/projected/fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35-kube-api-access-sg8ss\") pod \"insights-operator-585dfdc468-g6jpq\" (UID: \"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35\") " pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.178731 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.178700 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprh9\" (UniqueName: \"kubernetes.io/projected/bb31bacb-bc3d-4ab6-93ea-80941f907553-kube-api-access-lprh9\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.178731 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.178722 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkcl\" (UniqueName: \"kubernetes.io/projected/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-kube-api-access-sjkcl\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.354345 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.354319 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-g6jpq"
Apr 22 17:56:10.466388 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.466357 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g6jpq"]
Apr 22 17:56:10.469663 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:10.469636 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa95ea66_4b22_4d17_b4f8_4c7e83ef4d35.slice/crio-bcabe47fd02cd8732674abdf51fb825afdec4989df1e854faa0ec61cbeea58cd WatchSource:0}: Error finding container bcabe47fd02cd8732674abdf51fb825afdec4989df1e854faa0ec61cbeea58cd: Status 404 returned error can't find the container with id bcabe47fd02cd8732674abdf51fb825afdec4989df1e854faa0ec61cbeea58cd
Apr 22 17:56:10.504443 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.504411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" event={"ID":"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35","Type":"ContainerStarted","Data":"bcabe47fd02cd8732674abdf51fb825afdec4989df1e854faa0ec61cbeea58cd"}
Apr 22 17:56:10.670910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.670818 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.670910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.670883 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:10.670910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:10.670910 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:10.671140 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.670968 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:56:10.671140 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.670976 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:10.671140 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.671037 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:11.67102084 +0000 UTC m=+153.149544240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found
Apr 22 17:56:10.671255 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.671154 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:11.671140806 +0000 UTC m=+153.149664207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt
Apr 22 17:56:10.671255 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:10.671167 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:11.671160855 +0000 UTC m=+153.149684256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:11.679932 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:11.679892 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:11.679975 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:11.680018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:11.680052 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:11.680117 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:13.68009779 +0000 UTC m=+155.158621191 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:11.680150 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:13.680131735 +0000 UTC m=+155.158655152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:11.680172 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:11.680344 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:11.680222 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:13.680208907 +0000 UTC m=+155.158732315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:12.510084 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:12.509993 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" event={"ID":"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35","Type":"ContainerStarted","Data":"5a4f44f160a74f12ce9da463423db1d4a6c49326df29beac74902cf511fe756f"}
Apr 22 17:56:12.529346 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:12.529304 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" podStartSLOduration=0.858644011 podStartE2EDuration="2.529290187s" podCreationTimestamp="2026-04-22 17:56:10 +0000 UTC" firstStartedPulling="2026-04-22 17:56:10.471344864 +0000 UTC m=+151.949868269" lastFinishedPulling="2026-04-22 17:56:12.141991041 +0000 UTC m=+153.620514445" observedRunningTime="2026-04-22 17:56:12.528638168 +0000 UTC m=+154.007161591" watchObservedRunningTime="2026-04-22 17:56:12.529290187 +0000 UTC m=+154.007813610"
Apr 22 17:56:12.895400 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:12.895352 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" podUID="da61e571-00a2-4ad8-86ad-1156286a7409"
Apr 22 17:56:12.911541 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:12.911507 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" podUID="966c0818-a3ff-4930-86fc-27b6ab381b1a"
Apr 22 17:56:12.927679 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:12.927648 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sgmcw" podUID="4b38c080-af4d-4d73-ad1f-c364849e7212"
Apr 22 17:56:12.956976 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:12.956936 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bgpms" podUID="341b2cf5-e5b2-4950-98bb-f85daf6a0a5f"
Apr 22 17:56:13.511620 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.511587 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4"
Apr 22 17:56:13.511792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.511587 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:56:13.511792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.511589 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"
Apr 22 17:56:13.697615 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.697577 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:13.697775 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.697626 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:13.697775 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:13.697663 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"
Apr 22 17:56:13.697775 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:13.697704 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:56:13.697775 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:13.697766 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.697750943 +0000 UTC m=+159.176274343 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found
Apr 22 17:56:13.697965 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:13.697780 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:13.697965 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:13.697802 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.697785657 +0000 UTC m=+159.176309092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt
Apr 22 17:56:13.697965 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:13.697823 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.697813548 +0000 UTC m=+159.176336951 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:56:14.067054 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:14.067015 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wcgxk" podUID="2c1e2467-3796-47ad-928c-f82f435261e9"
Apr 22 17:56:15.266026 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:15.265947 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gglf5_494d8548-161a-40c6-aa4f-a43f0cb0ff07/dns-node-resolver/0.log"
Apr 22 17:56:15.868410 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:15.868383 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d9mrk_58e27bc8-4e3e-4655-9e83-8aed674d5e93/node-ca/0.log"
Apr 22 17:56:17.727094 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.727054 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q"
Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.727109 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") 
" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.727221 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:25.727199017 +0000 UTC m=+167.205722426 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.727244 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.727253 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.727325 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:25.7273084 +0000 UTC m=+167.205831806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.727350 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:56:17.727545 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.727397 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:25.727382956 +0000 UTC m=+167.205906360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found Apr 22 17:56:17.828501 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.828466 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.828513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") pod \"image-registry-5c57866bf4-cx2k4\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " 
pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.828532 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828635 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828663 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c57866bf4-cx2k4: secret "image-registry-tls" not found Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828685 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:56:17.828704 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828708 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls podName:966c0818-a3ff-4930-86fc-27b6ab381b1a nodeName:}" failed. No retries permitted until 2026-04-22 17:58:19.828694024 +0000 UTC m=+281.307217425 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls") pod "image-registry-5c57866bf4-cx2k4" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a") : secret "image-registry-tls" not found Apr 22 17:56:17.828953 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828726 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert podName:da61e571-00a2-4ad8-86ad-1156286a7409 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:19.828714853 +0000 UTC m=+281.307238258 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mhbld" (UID: "da61e571-00a2-4ad8-86ad-1156286a7409") : secret "networking-console-plugin-cert" not found Apr 22 17:56:17.828953 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828635 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:56:17.828953 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.828772 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls podName:4b38c080-af4d-4d73-ad1f-c364849e7212 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:19.828760491 +0000 UTC m=+281.307283891 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls") pod "dns-default-sgmcw" (UID: "4b38c080-af4d-4d73-ad1f-c364849e7212") : secret "dns-default-metrics-tls" not found Apr 22 17:56:17.929321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:17.929287 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:56:17.929466 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.929393 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:56:17.929466 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:17.929439 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert podName:341b2cf5-e5b2-4950-98bb-f85daf6a0a5f nodeName:}" failed. No retries permitted until 2026-04-22 17:58:19.929424952 +0000 UTC m=+281.407948353 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert") pod "ingress-canary-bgpms" (UID: "341b2cf5-e5b2-4950-98bb-f85daf6a0a5f") : secret "canary-serving-cert" not found Apr 22 17:56:19.020037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.020004 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98"] Apr 22 17:56:19.022848 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.022830 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.025429 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.025396 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:56:19.025429 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.025396 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 17:56:19.025761 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.025743 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 17:56:19.026522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.026503 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dq4kt\"" Apr 22 17:56:19.026592 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.026511 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 17:56:19.031450 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.031429 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98"] Apr 22 17:56:19.141330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.141286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqhqj\" (UniqueName: \"kubernetes.io/projected/ec52f02a-c723-4c97-a74b-1348c7d84b33-kube-api-access-qqhqj\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: 
\"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.141521 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.141350 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52f02a-c723-4c97-a74b-1348c7d84b33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.141521 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.141466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52f02a-c723-4c97-a74b-1348c7d84b33-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.242544 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.242508 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52f02a-c723-4c97-a74b-1348c7d84b33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.242684 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.242648 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52f02a-c723-4c97-a74b-1348c7d84b33-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: 
\"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.242760 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.242696 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqhqj\" (UniqueName: \"kubernetes.io/projected/ec52f02a-c723-4c97-a74b-1348c7d84b33-kube-api-access-qqhqj\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.243062 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.243037 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52f02a-c723-4c97-a74b-1348c7d84b33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.244825 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.244804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52f02a-c723-4c97-a74b-1348c7d84b33-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: \"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.250961 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.250939 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqhqj\" (UniqueName: \"kubernetes.io/projected/ec52f02a-c723-4c97-a74b-1348c7d84b33-kube-api-access-qqhqj\") pod \"kube-storage-version-migrator-operator-6769c5d45-jpj98\" (UID: 
\"ec52f02a-c723-4c97-a74b-1348c7d84b33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.332146 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.332080 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" Apr 22 17:56:19.449271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.449240 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98"] Apr 22 17:56:19.452308 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:19.452280 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec52f02a_c723_4c97_a74b_1348c7d84b33.slice/crio-9b2900bfc20878c0706c74e1be228de21b6ba4a069ac04d967a72e75ca2336e6 WatchSource:0}: Error finding container 9b2900bfc20878c0706c74e1be228de21b6ba4a069ac04d967a72e75ca2336e6: Status 404 returned error can't find the container with id 9b2900bfc20878c0706c74e1be228de21b6ba4a069ac04d967a72e75ca2336e6 Apr 22 17:56:19.523507 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:19.523475 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" event={"ID":"ec52f02a-c723-4c97-a74b-1348c7d84b33","Type":"ContainerStarted","Data":"9b2900bfc20878c0706c74e1be228de21b6ba4a069ac04d967a72e75ca2336e6"} Apr 22 17:56:20.020224 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.020191 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65"] Apr 22 17:56:20.023171 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.023148 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.025994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.025964 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 17:56:20.026122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.025999 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:56:20.026122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.026009 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-s7vp6\"" Apr 22 17:56:20.026122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.026018 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 17:56:20.026973 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.026953 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 17:56:20.034221 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.034198 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65"] Apr 22 17:56:20.151237 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.151205 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c992423-78aa-47f2-9542-6fc88fefda4b-config\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.151420 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.151395 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgdm5\" (UniqueName: \"kubernetes.io/projected/4c992423-78aa-47f2-9542-6fc88fefda4b-kube-api-access-lgdm5\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.151478 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.151434 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c992423-78aa-47f2-9542-6fc88fefda4b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.252229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.252142 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgdm5\" (UniqueName: \"kubernetes.io/projected/4c992423-78aa-47f2-9542-6fc88fefda4b-kube-api-access-lgdm5\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.252229 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.252189 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c992423-78aa-47f2-9542-6fc88fefda4b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.252446 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.252253 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c992423-78aa-47f2-9542-6fc88fefda4b-config\") pod 
\"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.252874 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.252827 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c992423-78aa-47f2-9542-6fc88fefda4b-config\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.254857 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.254822 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c992423-78aa-47f2-9542-6fc88fefda4b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.262047 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.262020 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgdm5\" (UniqueName: \"kubernetes.io/projected/4c992423-78aa-47f2-9542-6fc88fefda4b-kube-api-access-lgdm5\") pod \"service-ca-operator-d6fc45fc5-cnm65\" (UID: \"4c992423-78aa-47f2-9542-6fc88fefda4b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.333459 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.333379 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" Apr 22 17:56:20.466662 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.466631 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65"] Apr 22 17:56:20.469655 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:20.469623 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c992423_78aa_47f2_9542_6fc88fefda4b.slice/crio-77cfeb2f08571de1297711006d3101afe28e275d47aa990b6de583fb3a1580fa WatchSource:0}: Error finding container 77cfeb2f08571de1297711006d3101afe28e275d47aa990b6de583fb3a1580fa: Status 404 returned error can't find the container with id 77cfeb2f08571de1297711006d3101afe28e275d47aa990b6de583fb3a1580fa Apr 22 17:56:20.526829 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:20.526795 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" event={"ID":"4c992423-78aa-47f2-9542-6fc88fefda4b","Type":"ContainerStarted","Data":"77cfeb2f08571de1297711006d3101afe28e275d47aa990b6de583fb3a1580fa"} Apr 22 17:56:21.530036 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:21.529997 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" event={"ID":"ec52f02a-c723-4c97-a74b-1348c7d84b33","Type":"ContainerStarted","Data":"7f8cb1632cd6afa792da346b07df0d89b35f833dee67f64e2e40f0f6eaa948ed"} Apr 22 17:56:21.548770 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:21.548711 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" podStartSLOduration=0.565373734 podStartE2EDuration="2.548697825s" podCreationTimestamp="2026-04-22 17:56:19 +0000 
UTC" firstStartedPulling="2026-04-22 17:56:19.454017202 +0000 UTC m=+160.932540606" lastFinishedPulling="2026-04-22 17:56:21.437341289 +0000 UTC m=+162.915864697" observedRunningTime="2026-04-22 17:56:21.548063493 +0000 UTC m=+163.026586937" watchObservedRunningTime="2026-04-22 17:56:21.548697825 +0000 UTC m=+163.027221247" Apr 22 17:56:22.533913 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.533795 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" event={"ID":"4c992423-78aa-47f2-9542-6fc88fefda4b","Type":"ContainerStarted","Data":"106d0527e082ba9a55399e9d4de109e3e590f712068ff0fac4f6f2a717cadd58"} Apr 22 17:56:22.552471 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.552424 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" podStartSLOduration=0.782043553 podStartE2EDuration="2.552406832s" podCreationTimestamp="2026-04-22 17:56:20 +0000 UTC" firstStartedPulling="2026-04-22 17:56:20.471547651 +0000 UTC m=+161.950071065" lastFinishedPulling="2026-04-22 17:56:22.241910931 +0000 UTC m=+163.720434344" observedRunningTime="2026-04-22 17:56:22.551077082 +0000 UTC m=+164.029600505" watchObservedRunningTime="2026-04-22 17:56:22.552406832 +0000 UTC m=+164.030930255" Apr 22 17:56:22.861986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.861945 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn"] Apr 22 17:56:22.865419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.865393 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" Apr 22 17:56:22.868347 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.868315 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:56:22.868595 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.868569 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:56:22.868683 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.868615 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7qp8g\"" Apr 22 17:56:22.873409 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.873380 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn"] Apr 22 17:56:22.977524 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:22.977482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvztw\" (UniqueName: \"kubernetes.io/projected/5080e862-a8c9-455b-8f63-a505b89977b4-kube-api-access-gvztw\") pod \"migrator-74bb7799d9-g44hn\" (UID: \"5080e862-a8c9-455b-8f63-a505b89977b4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" Apr 22 17:56:23.078673 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:23.078634 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvztw\" (UniqueName: \"kubernetes.io/projected/5080e862-a8c9-455b-8f63-a505b89977b4-kube-api-access-gvztw\") pod \"migrator-74bb7799d9-g44hn\" (UID: \"5080e862-a8c9-455b-8f63-a505b89977b4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" Apr 22 17:56:23.090589 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:56:23.090556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvztw\" (UniqueName: \"kubernetes.io/projected/5080e862-a8c9-455b-8f63-a505b89977b4-kube-api-access-gvztw\") pod \"migrator-74bb7799d9-g44hn\" (UID: \"5080e862-a8c9-455b-8f63-a505b89977b4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" Apr 22 17:56:23.176008 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:23.175932 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" Apr 22 17:56:23.299394 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:23.299362 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn"] Apr 22 17:56:23.303490 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:23.303459 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5080e862_a8c9_455b_8f63_a505b89977b4.slice/crio-588319fcc19750b8a76ff3cd44c494672341f566bafd2e1b5020b3552b1216c2 WatchSource:0}: Error finding container 588319fcc19750b8a76ff3cd44c494672341f566bafd2e1b5020b3552b1216c2: Status 404 returned error can't find the container with id 588319fcc19750b8a76ff3cd44c494672341f566bafd2e1b5020b3552b1216c2 Apr 22 17:56:23.537612 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:23.537524 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" event={"ID":"5080e862-a8c9-455b-8f63-a505b89977b4","Type":"ContainerStarted","Data":"588319fcc19750b8a76ff3cd44c494672341f566bafd2e1b5020b3552b1216c2"} Apr 22 17:56:24.541439 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:24.541402 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" 
event={"ID":"5080e862-a8c9-455b-8f63-a505b89977b4","Type":"ContainerStarted","Data":"72ffcffbbbafb5a424d687cb2897f4392850ee280fdd9fca0b7e8923d5d2959c"} Apr 22 17:56:24.541797 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:24.541444 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" event={"ID":"5080e862-a8c9-455b-8f63-a505b89977b4","Type":"ContainerStarted","Data":"3d8317877991080a9cdc9fc06a84ccd38c27472da12612e7b123f5238f4a4276"} Apr 22 17:56:24.562002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:24.561947 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-g44hn" podStartSLOduration=1.536254408 podStartE2EDuration="2.561934176s" podCreationTimestamp="2026-04-22 17:56:22 +0000 UTC" firstStartedPulling="2026-04-22 17:56:23.305611228 +0000 UTC m=+164.784134633" lastFinishedPulling="2026-04-22 17:56:24.331290997 +0000 UTC m=+165.809814401" observedRunningTime="2026-04-22 17:56:24.560156079 +0000 UTC m=+166.038679501" watchObservedRunningTime="2026-04-22 17:56:24.561934176 +0000 UTC m=+166.040457599" Apr 22 17:56:25.053264 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:25.053226 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:56:25.803168 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:25.803128 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:25.803180 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:25.803293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:25.803307 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:41.803289921 +0000 UTC m=+183.281813325 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : configmap references non-existent config key: service-ca.crt Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:25.803359 2564 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:25.803375 2564 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:25.803423 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls podName:bb31bacb-bc3d-4ab6-93ea-80941f907553 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:41.80340643 +0000 UTC m=+183.281929832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-h5pjc" (UID: "bb31bacb-bc3d-4ab6-93ea-80941f907553") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:56:25.803584 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:25.803443 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs podName:92d95712-bf5b-4abb-8d3c-309ebdebbdd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:41.803433185 +0000 UTC m=+183.281956589 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs") pod "router-default-768bdd6654-j6n7q" (UID: "92d95712-bf5b-4abb-8d3c-309ebdebbdd5") : secret "router-metrics-certs-default" not found Apr 22 17:56:27.053243 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:27.053205 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:56:41.838703 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.838671 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:41.839163 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.838720 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:41.839163 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.838780 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:41.839252 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.839237 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-service-ca-bundle\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:41.841169 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.841151 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92d95712-bf5b-4abb-8d3c-309ebdebbdd5-metrics-certs\") pod \"router-default-768bdd6654-j6n7q\" (UID: \"92d95712-bf5b-4abb-8d3c-309ebdebbdd5\") " pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:41.841223 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.841188 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb31bacb-bc3d-4ab6-93ea-80941f907553-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-h5pjc\" (UID: \"bb31bacb-bc3d-4ab6-93ea-80941f907553\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:41.864015 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.863987 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tjk72\"" Apr 22 17:56:41.869535 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.869519 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hbkwr\"" Apr 22 17:56:41.871645 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.871629 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" Apr 22 17:56:41.878376 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.878285 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:41.999672 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:41.999642 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc"] Apr 22 17:56:42.002243 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:42.002214 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb31bacb_bc3d_4ab6_93ea_80941f907553.slice/crio-59a7a8564231298d0414b453963dd853cf8167c3a3dd461987817f01a6e41fcf WatchSource:0}: Error finding container 59a7a8564231298d0414b453963dd853cf8167c3a3dd461987817f01a6e41fcf: Status 404 returned error can't find the container with id 59a7a8564231298d0414b453963dd853cf8167c3a3dd461987817f01a6e41fcf Apr 22 17:56:42.016426 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.016400 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-768bdd6654-j6n7q"] Apr 22 17:56:42.019145 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:42.019119 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d95712_bf5b_4abb_8d3c_309ebdebbdd5.slice/crio-af9bba114f2750190fdba01be9631959b15a264b4e28b650dc9bb3cd0df38150 WatchSource:0}: Error finding container af9bba114f2750190fdba01be9631959b15a264b4e28b650dc9bb3cd0df38150: Status 404 returned error can't find the container with id af9bba114f2750190fdba01be9631959b15a264b4e28b650dc9bb3cd0df38150 Apr 22 17:56:42.587460 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.587369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-768bdd6654-j6n7q" event={"ID":"92d95712-bf5b-4abb-8d3c-309ebdebbdd5","Type":"ContainerStarted","Data":"28e138aee132473d7d575be983330e4d33f5ec1de7c085a8465d85fa09c40538"} Apr 22 17:56:42.587460 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:56:42.587410 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-768bdd6654-j6n7q" event={"ID":"92d95712-bf5b-4abb-8d3c-309ebdebbdd5","Type":"ContainerStarted","Data":"af9bba114f2750190fdba01be9631959b15a264b4e28b650dc9bb3cd0df38150"} Apr 22 17:56:42.588396 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.588371 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" event={"ID":"bb31bacb-bc3d-4ab6-93ea-80941f907553","Type":"ContainerStarted","Data":"59a7a8564231298d0414b453963dd853cf8167c3a3dd461987817f01a6e41fcf"} Apr 22 17:56:42.610262 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.610218 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-768bdd6654-j6n7q" podStartSLOduration=32.610204364 podStartE2EDuration="32.610204364s" podCreationTimestamp="2026-04-22 17:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:42.609279025 +0000 UTC m=+184.087802450" watchObservedRunningTime="2026-04-22 17:56:42.610204364 +0000 UTC m=+184.088727787" Apr 22 17:56:42.879254 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.879220 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:42.882121 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:42.882093 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:43.591541 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:43.591494 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:43.592987 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:43.592952 2564 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-768bdd6654-j6n7q" Apr 22 17:56:44.594984 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:44.594939 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" event={"ID":"bb31bacb-bc3d-4ab6-93ea-80941f907553","Type":"ContainerStarted","Data":"06cfd3a9a9cb94b181b3e31cdf03f609fa981c03f7483659aabb55792ac3f07d"} Apr 22 17:56:44.615010 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:44.614964 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-h5pjc" podStartSLOduration=32.859943204 podStartE2EDuration="34.614947055s" podCreationTimestamp="2026-04-22 17:56:10 +0000 UTC" firstStartedPulling="2026-04-22 17:56:42.00415567 +0000 UTC m=+183.482679074" lastFinishedPulling="2026-04-22 17:56:43.759159521 +0000 UTC m=+185.237682925" observedRunningTime="2026-04-22 17:56:44.613606124 +0000 UTC m=+186.092129560" watchObservedRunningTime="2026-04-22 17:56:44.614947055 +0000 UTC m=+186.093470477" Apr 22 17:56:48.264048 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.264013 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9fq46"] Apr 22 17:56:48.267362 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.267343 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.270756 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.270736 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:48.270968 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.270952 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:48.272168 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.272150 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cl79w\"" Apr 22 17:56:48.281898 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.281872 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9fq46"] Apr 22 17:56:48.395522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.395492 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9n2k\" (UniqueName: \"kubernetes.io/projected/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-api-access-f9n2k\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.395673 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.395569 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c96db02a-45a4-45f2-8706-3e634cf21f29-data-volume\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.395673 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.395589 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c96db02a-45a4-45f2-8706-3e634cf21f29-crio-socket\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.395673 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.395618 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c96db02a-45a4-45f2-8706-3e634cf21f29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.395802 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.395710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496504 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c96db02a-45a4-45f2-8706-3e634cf21f29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496504 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496487 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9fq46\" (UID: 
\"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496672 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9n2k\" (UniqueName: \"kubernetes.io/projected/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-api-access-f9n2k\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496712 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496674 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c96db02a-45a4-45f2-8706-3e634cf21f29-data-volume\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496712 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496707 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c96db02a-45a4-45f2-8706-3e634cf21f29-crio-socket\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.496818 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.496802 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c96db02a-45a4-45f2-8706-3e634cf21f29-crio-socket\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.497023 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.497003 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/c96db02a-45a4-45f2-8706-3e634cf21f29-data-volume\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.497627 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.497610 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.498721 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.498703 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c96db02a-45a4-45f2-8706-3e634cf21f29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.507082 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.507060 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9n2k\" (UniqueName: \"kubernetes.io/projected/c96db02a-45a4-45f2-8706-3e634cf21f29-kube-api-access-f9n2k\") pod \"insights-runtime-extractor-9fq46\" (UID: \"c96db02a-45a4-45f2-8706-3e634cf21f29\") " pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.576708 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.576683 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9fq46" Apr 22 17:56:48.697033 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:48.697000 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9fq46"] Apr 22 17:56:48.700079 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:48.700047 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc96db02a_45a4_45f2_8706_3e634cf21f29.slice/crio-047a0ebcfed995a957c05cc1b19835333aea7567311fbee1137226872af713e4 WatchSource:0}: Error finding container 047a0ebcfed995a957c05cc1b19835333aea7567311fbee1137226872af713e4: Status 404 returned error can't find the container with id 047a0ebcfed995a957c05cc1b19835333aea7567311fbee1137226872af713e4 Apr 22 17:56:49.609001 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:49.608964 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9fq46" event={"ID":"c96db02a-45a4-45f2-8706-3e634cf21f29","Type":"ContainerStarted","Data":"664bad2df841cdba52d48f26f7721a9a70f7d9ae4848f89016be4e27fca7c633"} Apr 22 17:56:49.609001 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:49.609006 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9fq46" event={"ID":"c96db02a-45a4-45f2-8706-3e634cf21f29","Type":"ContainerStarted","Data":"760eda5c8fbcab98599df55420c5e098ead3545eb155f9c5692ca8ed2eb74a9c"} Apr 22 17:56:49.609388 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:49.609018 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9fq46" event={"ID":"c96db02a-45a4-45f2-8706-3e634cf21f29","Type":"ContainerStarted","Data":"047a0ebcfed995a957c05cc1b19835333aea7567311fbee1137226872af713e4"} Apr 22 17:56:51.616156 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:51.616117 2564 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-9fq46" event={"ID":"c96db02a-45a4-45f2-8706-3e634cf21f29","Type":"ContainerStarted","Data":"9f7643d144f83a4cc134f0b6f12ffb0926452225b642aa38423e173638f15067"} Apr 22 17:56:51.641746 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:51.641697 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9fq46" podStartSLOduration=1.716561279 podStartE2EDuration="3.641683399s" podCreationTimestamp="2026-04-22 17:56:48 +0000 UTC" firstStartedPulling="2026-04-22 17:56:48.754034041 +0000 UTC m=+190.232557442" lastFinishedPulling="2026-04-22 17:56:50.679156149 +0000 UTC m=+192.157679562" observedRunningTime="2026-04-22 17:56:51.639514084 +0000 UTC m=+193.118037507" watchObservedRunningTime="2026-04-22 17:56:51.641683399 +0000 UTC m=+193.120206850" Apr 22 17:56:56.686701 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.686659 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"] Apr 22 17:56:56.690525 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.690507 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" Apr 22 17:56:56.693875 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.693836 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 17:56:56.693994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.693917 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 17:56:56.694940 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.694918 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ncvz9\"" Apr 22 17:56:56.695324 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.695263 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 17:56:56.695324 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.695267 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:56:56.703877 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.701652 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"] Apr 22 17:56:56.706087 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.706067 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2znk9"] Apr 22 17:56:56.709190 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.709173 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2znk9" Apr 22 17:56:56.711502 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.711481 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:56:56.711604 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.711506 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:56:56.711604 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.711522 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7s4kf\"" Apr 22 17:56:56.711604 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.711549 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:56:56.760495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760467 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7bb8ac7-2202-43de-96a7-145ad6c915e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" Apr 22 17:56:56.760495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760496 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" Apr 22 17:56:56.760655 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760511 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-sys\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760655 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760533 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-metrics-client-ca\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760655 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.760756 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760654 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-textfile\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760756 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760677 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760756 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760731 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-root\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760756 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760747 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-wtmp\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760774 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.760910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760809 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.760910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760826 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn694\" (UniqueName: \"kubernetes.io/projected/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-api-access-wn694\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.760910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760840 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dr5\" (UniqueName: \"kubernetes.io/projected/6df46903-fedf-4d2c-b525-4b5f8730c544-kube-api-access-92dr5\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.760910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760886 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.761059 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.760943 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-accelerators-collector-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.861992 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.861960 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-accelerators-collector-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862015 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7bb8ac7-2202-43de-96a7-145ad6c915e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862032 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862050 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-sys\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862073 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-metrics-client-ca\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862102 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862161 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-textfile\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862166 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862181 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-sys\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862217 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-root\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-wtmp\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862300 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862325 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn694\" (UniqueName: \"kubernetes.io/projected/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-api-access-wn694\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862351 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92dr5\" (UniqueName: \"kubernetes.io/projected/6df46903-fedf-4d2c-b525-4b5f8730c544-kube-api-access-92dr5\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862380 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.862495 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:56.862439 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862497 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-textfile\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:56.862528 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls podName:6df46903-fedf-4d2c-b525-4b5f8730c544 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:57.362502659 +0000 UTC m=+198.841026074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls") pod "node-exporter-2znk9" (UID: "6df46903-fedf-4d2c-b525-4b5f8730c544") : secret "node-exporter-tls" not found
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862440 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7bb8ac7-2202-43de-96a7-145ad6c915e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:56.862669 2564 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862685 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-accelerators-collector-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:56.862726 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls podName:b7bb8ac7-2202-43de-96a7-145ad6c915e7 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:57.362709901 +0000 UTC m=+198.841233316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xvcw9" (UID: "b7bb8ac7-2202-43de-96a7-145ad6c915e7") : secret "kube-state-metrics-tls" not found
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862775 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df46903-fedf-4d2c-b525-4b5f8730c544-metrics-client-ca\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862854 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-wtmp\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.863037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.862959 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6df46903-fedf-4d2c-b525-4b5f8730c544-root\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.863456 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.863436 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.863596 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.863574 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.864765 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.864740 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:56.864846 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.864809 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.874069 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.874051 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn694\" (UniqueName: \"kubernetes.io/projected/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-api-access-wn694\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:56.874154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:56.874109 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dr5\" (UniqueName: \"kubernetes.io/projected/6df46903-fedf-4d2c-b525-4b5f8730c544-kube-api-access-92dr5\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:57.366219 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.366182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:57.366393 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.366241 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:57.368540 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.368508 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7bb8ac7-2202-43de-96a7-145ad6c915e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvcw9\" (UID: \"b7bb8ac7-2202-43de-96a7-145ad6c915e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:57.368677 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.368570 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6df46903-fedf-4d2c-b525-4b5f8730c544-node-exporter-tls\") pod \"node-exporter-2znk9\" (UID: \"6df46903-fedf-4d2c-b525-4b5f8730c544\") " pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:57.600131 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.600094 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"
Apr 22 17:56:57.618073 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.617998 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2znk9"
Apr 22 17:56:57.629344 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:57.629309 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df46903_fedf_4d2c_b525_4b5f8730c544.slice/crio-54248d8a8e98e06d004920558fdd09b76833d221b04d1ad2abd38a68ba324532 WatchSource:0}: Error finding container 54248d8a8e98e06d004920558fdd09b76833d221b04d1ad2abd38a68ba324532: Status 404 returned error can't find the container with id 54248d8a8e98e06d004920558fdd09b76833d221b04d1ad2abd38a68ba324532
Apr 22 17:56:57.726991 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.726952 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvcw9"]
Apr 22 17:56:57.730959 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:57.730931 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bb8ac7_2202_43de_96a7_145ad6c915e7.slice/crio-f6de7570287c7a14bfe48e10e233ebb62747cad8ff97aab61a1558ef1c4f201f WatchSource:0}: Error finding container f6de7570287c7a14bfe48e10e233ebb62747cad8ff97aab61a1558ef1c4f201f: Status 404 returned error can't find the container with id f6de7570287c7a14bfe48e10e233ebb62747cad8ff97aab61a1558ef1c4f201f
Apr 22 17:56:57.749770 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.749524 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:56:57.754509 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.754493 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.757108 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.757093 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 17:56:57.757190 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.757093 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 17:56:57.757672 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.757655 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 17:56:57.757779 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.757760 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 17:56:57.758233 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758217 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 17:56:57.758312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758238 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 17:56:57.758488 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758474 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 17:56:57.758657 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758644 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5wsth\""
Apr 22 17:56:57.758731 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758678 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 17:56:57.758790 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.758772 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 17:56:57.770296 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770274 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770390 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770323 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770390 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770377 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770496 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770452 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770496 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770489 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5n44\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770592 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770592 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770559 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770672 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770591 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770672 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770617 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770743 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770712 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770743 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770738 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770817 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770753 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.770817 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.770769 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.775778 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.775756 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:56:57.871394 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871394 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871354 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871394 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871372 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871394 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871421 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871458 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871481 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871524 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5n44\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871610 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:57.871617 2564 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871637 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.871679 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.871668 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.872149 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:57.871751 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls podName:d578c48e-d4be-4f1a-bfa4-022240000a28 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:58.371729888 +0000 UTC m=+199.850253289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28") : secret "alertmanager-main-tls" not found
Apr 22 17:56:57.873709 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:56:57.872424 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle podName:d578c48e-d4be-4f1a-bfa4-022240000a28 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:58.372406796 +0000 UTC m=+199.850930212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28") : configmap references non-existent config key: ca-bundle.crt
Apr 22 17:56:57.873709 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.872711 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.873709 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.873321 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.876090 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.874439 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.876090 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.875522 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.876090 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.875912 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.876620 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.876547 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:56:57.876742 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.876652 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:57.876811 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.876757 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:57.876855 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.876810 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:57.876970 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.876951 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:57.883752 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:57.883727 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5n44\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.376684 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.376656 2564 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.376904 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.376805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.377665 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.377637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.380060 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.380030 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.635231 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.634852 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" event={"ID":"b7bb8ac7-2202-43de-96a7-145ad6c915e7","Type":"ContainerStarted","Data":"f6de7570287c7a14bfe48e10e233ebb62747cad8ff97aab61a1558ef1c4f201f"} Apr 22 17:56:58.638150 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.638113 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2znk9" event={"ID":"6df46903-fedf-4d2c-b525-4b5f8730c544","Type":"ContainerStarted","Data":"fa62e56f15490051e0962eb229968669834a1da6589b40fa54b94279d8e2fc72"} Apr 22 17:56:58.638292 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.638162 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2znk9" event={"ID":"6df46903-fedf-4d2c-b525-4b5f8730c544","Type":"ContainerStarted","Data":"54248d8a8e98e06d004920558fdd09b76833d221b04d1ad2abd38a68ba324532"} Apr 22 17:56:58.665544 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.665509 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:58.942571 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:58.942548 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:56:58.943095 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:56:58.943072 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd578c48e_d4be_4f1a_bfa4_022240000a28.slice/crio-f797246eb700e535b3cf7d41412f0c588f796656e8ae18d8eef53f1194f5ac9c WatchSource:0}: Error finding container f797246eb700e535b3cf7d41412f0c588f796656e8ae18d8eef53f1194f5ac9c: Status 404 returned error can't find the container with id f797246eb700e535b3cf7d41412f0c588f796656e8ae18d8eef53f1194f5ac9c Apr 22 17:56:59.642697 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.642659 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" event={"ID":"b7bb8ac7-2202-43de-96a7-145ad6c915e7","Type":"ContainerStarted","Data":"23f0f5adb0ba10063c1100e467709635936d8831e266b2cb4abf61f1a5671df0"} Apr 22 17:56:59.642697 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.642701 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" event={"ID":"b7bb8ac7-2202-43de-96a7-145ad6c915e7","Type":"ContainerStarted","Data":"32545f6f6739afa7a8a26ab898096c8122ec95eb94a3e0bfe5b98b0a98333e80"} Apr 22 17:56:59.642949 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.642714 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" event={"ID":"b7bb8ac7-2202-43de-96a7-145ad6c915e7","Type":"ContainerStarted","Data":"aa3a8685c5851fde7702523d5074d87099de997cc8485d7fdcdb47e1cd84cd1f"} Apr 22 17:56:59.644150 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.644120 2564 generic.go:358] "Generic (PLEG): container finished" podID="6df46903-fedf-4d2c-b525-4b5f8730c544" containerID="fa62e56f15490051e0962eb229968669834a1da6589b40fa54b94279d8e2fc72" exitCode=0 Apr 22 17:56:59.644273 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.644174 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2znk9" event={"ID":"6df46903-fedf-4d2c-b525-4b5f8730c544","Type":"ContainerDied","Data":"fa62e56f15490051e0962eb229968669834a1da6589b40fa54b94279d8e2fc72"} Apr 22 17:56:59.645446 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.645417 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"f797246eb700e535b3cf7d41412f0c588f796656e8ae18d8eef53f1194f5ac9c"} Apr 22 17:56:59.697641 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.697583 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvcw9" podStartSLOduration=2.56883112 podStartE2EDuration="3.697564972s" podCreationTimestamp="2026-04-22 17:56:56 +0000 UTC" firstStartedPulling="2026-04-22 17:56:57.732753935 +0000 UTC m=+199.211277339" lastFinishedPulling="2026-04-22 17:56:58.86148779 
+0000 UTC m=+200.340011191" observedRunningTime="2026-04-22 17:56:59.671347625 +0000 UTC m=+201.149871070" watchObservedRunningTime="2026-04-22 17:56:59.697564972 +0000 UTC m=+201.176088395" Apr 22 17:56:59.698812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.698792 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-775755d68c-n8jrt"] Apr 22 17:56:59.703193 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.703177 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.705779 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.705759 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 17:56:59.705900 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.705793 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 17:56:59.706308 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.706288 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 17:56:59.706382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.706362 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 17:56:59.706637 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.706620 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-d8q5j\"" Apr 22 17:56:59.706792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.706774 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-cbldrhffrjdvl\"" Apr 22 17:56:59.706882 ip-10-0-143-11 kubenswrapper[2564]: I0422 
17:56:59.706820 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 17:56:59.723040 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.723019 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-775755d68c-n8jrt"] Apr 22 17:56:59.791154 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791114 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791301 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791160 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-grpc-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791301 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791251 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5916f292-3d76-4f55-81bb-25b2585feda7-metrics-client-ca\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791396 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791526 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791478 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791525 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.791636 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.791598 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxp72\" (UniqueName: 
\"kubernetes.io/projected/5916f292-3d76-4f55-81bb-25b2585feda7-kube-api-access-wxp72\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893009 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.892846 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893143 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.892994 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-grpc-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893199 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.893144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893199 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.893191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5916f292-3d76-4f55-81bb-25b2585feda7-metrics-client-ca\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " 
pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893289 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.893235 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893289 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.893279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.893997 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.893969 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.894130 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.894040 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxp72\" (UniqueName: \"kubernetes.io/projected/5916f292-3d76-4f55-81bb-25b2585feda7-kube-api-access-wxp72\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.894196 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.894167 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5916f292-3d76-4f55-81bb-25b2585feda7-metrics-client-ca\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.896055 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.896005 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-grpc-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.896460 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.896437 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.896532 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.896505 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.896790 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.896765 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.897709 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.897684 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.897910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.897890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5916f292-3d76-4f55-81bb-25b2585feda7-secret-thanos-querier-tls\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:56:59.902963 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:56:59.902942 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxp72\" (UniqueName: \"kubernetes.io/projected/5916f292-3d76-4f55-81bb-25b2585feda7-kube-api-access-wxp72\") pod \"thanos-querier-775755d68c-n8jrt\" (UID: \"5916f292-3d76-4f55-81bb-25b2585feda7\") " pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:57:00.012985 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.012945 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:57:00.321686 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.321660 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-775755d68c-n8jrt"] Apr 22 17:57:00.324240 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:57:00.324210 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5916f292_3d76_4f55_81bb_25b2585feda7.slice/crio-c740eb3c0c09d7514610d415502c0920ba512ac889b2bd0d6bef17c87352c569 WatchSource:0}: Error finding container c740eb3c0c09d7514610d415502c0920ba512ac889b2bd0d6bef17c87352c569: Status 404 returned error can't find the container with id c740eb3c0c09d7514610d415502c0920ba512ac889b2bd0d6bef17c87352c569 Apr 22 17:57:00.649424 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.649387 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2znk9" event={"ID":"6df46903-fedf-4d2c-b525-4b5f8730c544","Type":"ContainerStarted","Data":"05c66fd44dfb60c0dbd1400a8c0125c37157415d4977b318af09bcc6806cd6ee"} Apr 22 17:57:00.649584 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.649430 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2znk9" event={"ID":"6df46903-fedf-4d2c-b525-4b5f8730c544","Type":"ContainerStarted","Data":"5d8ea0c18ff57efb1cfd9534070f1d2ed1bd6f7b8952024402abe21132246a8f"} Apr 22 17:57:00.650419 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.650392 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"c740eb3c0c09d7514610d415502c0920ba512ac889b2bd0d6bef17c87352c569"} Apr 22 17:57:00.651508 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.651484 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff" exitCode=0 Apr 22 17:57:00.651616 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.651566 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"} Apr 22 17:57:00.671815 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:00.671756 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2znk9" podStartSLOduration=3.935917752 podStartE2EDuration="4.671738581s" podCreationTimestamp="2026-04-22 17:56:56 +0000 UTC" firstStartedPulling="2026-04-22 17:56:57.63157562 +0000 UTC m=+199.110099029" lastFinishedPulling="2026-04-22 17:56:58.367396444 +0000 UTC m=+199.845919858" observedRunningTime="2026-04-22 17:57:00.67060014 +0000 UTC m=+202.149123574" watchObservedRunningTime="2026-04-22 17:57:00.671738581 +0000 UTC m=+202.150262005" Apr 22 17:57:02.660235 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.660176 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"caa9dca95b182d596794f43414ab838166a89b6cec8cfe9daafe6c6de5b38bf9"} Apr 22 17:57:02.660235 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.660219 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"8a07cd5e5a715c20170001cefe1586b1bf8a64a2bc98ae954c3d76d38fc8be21"} Apr 22 17:57:02.660235 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.660234 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" 
event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"17e912cf62774dba6e9d06ffd20e165cc71cbe7409c20ec7fc875f455de28872"} Apr 22 17:57:02.662450 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.662426 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"} Apr 22 17:57:02.662450 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.662456 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"} Apr 22 17:57:02.662612 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.662465 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"} Apr 22 17:57:02.662612 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.662474 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"} Apr 22 17:57:02.988355 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.988279 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:02.991807 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.991792 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:02.994525 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.994503 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 17:57:02.994652 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.994607 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 17:57:02.994853 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.994839 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 17:57:02.994942 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.994838 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fmjgopp7k07jq\"" Apr 22 17:57:02.995599 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.995584 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 17:57:02.995819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.995799 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 17:57:02.995921 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.995841 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.997138 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.997812 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 
17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.998122 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hhcvh\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.998555 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:02.999593 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.000018 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:57:03.002151 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.000678 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 17:57:03.003501 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.003481 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 17:57:03.008675 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.008655 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:03.025664 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025639 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025759 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025759 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025838 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025764 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025838 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025781 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025838 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config\") pod \"prometheus-k8s-0\" (UID: 
\"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025838 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025819 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025897 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025940 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025954 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.025989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.025968 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026139 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026018 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026139 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzxl8\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026139 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026139 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026126 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026141 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.026263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.026181 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.126847 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.126813 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127024 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.126897 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127024 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.126940 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127024 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.126969 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127024 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.126998 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127029 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127111 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127149 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127218 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127253 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127251 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzxl8\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127353 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127586 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127586 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127418 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.127764 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.127674 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.128542 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.128014 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.128542 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.128495 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.130907 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.130627 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.131886 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.131029 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.131886 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.131548 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.131886 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.131842 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.132140 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.131846 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.132140 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.131967 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.132698 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.132610 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.132984 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.132959 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.134217 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.134198 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.134486 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.134441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.135094 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.135053 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.136108 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.136064 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.136191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.136150 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.136989 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.136956 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.143206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.143169 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzxl8\" 
(UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8\") pod \"prometheus-k8s-0\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.304983 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.304903 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:03.512037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.512009 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:03.516980 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:57:03.516946 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce9887d_e613_4031_b32d_7e8e61da9ca7.slice/crio-b0f3be24ec347d64ce811cfe1c4e0bf08270a2317497c5d295e3372d463e7929 WatchSource:0}: Error finding container b0f3be24ec347d64ce811cfe1c4e0bf08270a2317497c5d295e3372d463e7929: Status 404 returned error can't find the container with id b0f3be24ec347d64ce811cfe1c4e0bf08270a2317497c5d295e3372d463e7929 Apr 22 17:57:03.668122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.668090 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"9bd8f565e19003a8d279076203b0f36a8c2ea641cbb738386cd3fc8c8c80a9b3"} Apr 22 17:57:03.668522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.668130 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"56dbf389f64cfa3f5f5751df77941ec9e4388223a64e43498ca1e3f9aadf6534"} Apr 22 17:57:03.668522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.668145 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" event={"ID":"5916f292-3d76-4f55-81bb-25b2585feda7","Type":"ContainerStarted","Data":"e256f4b44292f23d7d3525cf1503fd1d220a1f489297e8b07a86483387bf3bc9"} Apr 22 17:57:03.668522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.668241 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:57:03.670925 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.670897 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"} Apr 22 17:57:03.671046 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.670930 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerStarted","Data":"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"} Apr 22 17:57:03.672310 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.672279 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="da369aa023f315276ae065533960ff7c03db04d28284559e1729ff6fac441ad0" exitCode=0 Apr 22 17:57:03.672403 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.672321 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"da369aa023f315276ae065533960ff7c03db04d28284559e1729ff6fac441ad0"} Apr 22 17:57:03.672403 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.672350 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"b0f3be24ec347d64ce811cfe1c4e0bf08270a2317497c5d295e3372d463e7929"} Apr 22 17:57:03.690233 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.690196 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" podStartSLOduration=1.6667297140000001 podStartE2EDuration="4.690166788s" podCreationTimestamp="2026-04-22 17:56:59 +0000 UTC" firstStartedPulling="2026-04-22 17:57:00.326197033 +0000 UTC m=+201.804720438" lastFinishedPulling="2026-04-22 17:57:03.349634097 +0000 UTC m=+204.828157512" observedRunningTime="2026-04-22 17:57:03.689121266 +0000 UTC m=+205.167644714" watchObservedRunningTime="2026-04-22 17:57:03.690166788 +0000 UTC m=+205.168690213" Apr 22 17:57:03.720801 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:03.720758 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.312262126 podStartE2EDuration="6.720746927s" podCreationTimestamp="2026-04-22 17:56:57 +0000 UTC" firstStartedPulling="2026-04-22 17:56:58.945264182 +0000 UTC m=+200.423787599" lastFinishedPulling="2026-04-22 17:57:03.353748993 +0000 UTC m=+204.832272400" observedRunningTime="2026-04-22 17:57:03.718755934 +0000 UTC m=+205.197279358" watchObservedRunningTime="2026-04-22 17:57:03.720746927 +0000 UTC m=+205.199270349" Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"8398240f08fcd471760f8bdbb422d057959e0b48fd161fcebafc4409694b1ce1"} Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686228 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"4756a4f4cfee97ca7f0824b4045208a5278e3351f7245c85062795124e0c7959"} Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686240 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"6fa81960161a48cc8303b604f6232707ea9981dab2a3225c27962788f4895ac1"} Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686248 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"96900e6fd2f3c2a97b932264172b07e85df0ea0c73e24044aab76b661121615f"} Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686256 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"b487639879de08c486908e78bc3154289831ca1a61a18bb81eda78476abd6df3"} Apr 22 17:57:06.686282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:06.686266 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerStarted","Data":"ab6f4378d073fe3762db588cfb57af06b9038d87fd3ce04fbff034f26fcbd125"} Apr 22 17:57:08.305752 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:08.305713 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:09.682286 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:09.682257 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-775755d68c-n8jrt" Apr 22 17:57:09.708460 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:09.708413 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.256396428 podStartE2EDuration="7.708398509s" podCreationTimestamp="2026-04-22 17:57:02 +0000 UTC" firstStartedPulling="2026-04-22 17:57:03.673371021 +0000 UTC m=+205.151894426" lastFinishedPulling="2026-04-22 17:57:06.125373106 +0000 UTC m=+207.603896507" observedRunningTime="2026-04-22 17:57:06.721414873 +0000 UTC m=+208.199938297" watchObservedRunningTime="2026-04-22 17:57:09.708398509 +0000 UTC m=+211.186921939" Apr 22 17:57:10.531114 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.531081 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c57866bf4-cx2k4"] Apr 22 17:57:10.531304 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:57:10.531286 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" podUID="966c0818-a3ff-4930-86fc-27b6ab381b1a" Apr 22 17:57:10.697055 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.697024 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:57:10.701811 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.701789 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:57:10.801111 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801035 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801111 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801087 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801111 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801113 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801335 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801137 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801335 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801157 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: 
\"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801335 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801211 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801335 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801236 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dtpb\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb\") pod \"966c0818-a3ff-4930-86fc-27b6ab381b1a\" (UID: \"966c0818-a3ff-4930-86fc-27b6ab381b1a\") " Apr 22 17:57:10.801537 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801436 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:10.801595 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801548 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/966c0818-a3ff-4930-86fc-27b6ab381b1a-ca-trust-extracted\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.801595 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801571 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:10.801595 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.801583 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:10.803570 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.803528 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:10.803570 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.803534 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb" (OuterVolumeSpecName: "kube-api-access-5dtpb") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "kube-api-access-5dtpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:10.803740 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.803581 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:10.803740 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.803601 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "966c0818-a3ff-4930-86fc-27b6ab381b1a" (UID: "966c0818-a3ff-4930-86fc-27b6ab381b1a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:10.902468 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902441 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-image-registry-private-configuration\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.902468 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902465 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-bound-sa-token\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.902631 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902476 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-certificates\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.902631 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902487 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/966c0818-a3ff-4930-86fc-27b6ab381b1a-installation-pull-secrets\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.902631 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902502 2564 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dtpb\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-kube-api-access-5dtpb\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:10.902631 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:10.902511 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/966c0818-a3ff-4930-86fc-27b6ab381b1a-trusted-ca\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:11.699536 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:11.699503 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c57866bf4-cx2k4" Apr 22 17:57:11.733056 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:11.733029 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c57866bf4-cx2k4"] Apr 22 17:57:11.736930 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:11.736904 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c57866bf4-cx2k4"] Apr 22 17:57:11.810796 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:11.810770 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/966c0818-a3ff-4930-86fc-27b6ab381b1a-registry-tls\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:57:13.061330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:13.061298 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966c0818-a3ff-4930-86fc-27b6ab381b1a" path="/var/lib/kubelet/pods/966c0818-a3ff-4930-86fc-27b6ab381b1a/volumes" Apr 22 17:57:23.735952 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:23.735909 2564 generic.go:358] "Generic (PLEG): container finished" podID="fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35" containerID="5a4f44f160a74f12ce9da463423db1d4a6c49326df29beac74902cf511fe756f" 
exitCode=0 Apr 22 17:57:23.736353 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:23.735970 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" event={"ID":"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35","Type":"ContainerDied","Data":"5a4f44f160a74f12ce9da463423db1d4a6c49326df29beac74902cf511fe756f"} Apr 22 17:57:23.736353 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:23.736301 2564 scope.go:117] "RemoveContainer" containerID="5a4f44f160a74f12ce9da463423db1d4a6c49326df29beac74902cf511fe756f" Apr 22 17:57:24.740653 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:24.740617 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g6jpq" event={"ID":"fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35","Type":"ContainerStarted","Data":"d648add4cc2f5f09d03059fbbc0b60acd836b51af17505a3faa9fd7864067104"} Apr 22 17:57:25.907189 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:25.907164 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gglf5_494d8548-161a-40c6-aa4f-a43f0cb0ff07/dns-node-resolver/0.log" Apr 22 17:57:26.307625 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:26.307543 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-768bdd6654-j6n7q_92d95712-bf5b-4abb-8d3c-309ebdebbdd5/router/0.log" Apr 22 17:57:38.783029 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:38.782994 2564 generic.go:358] "Generic (PLEG): container finished" podID="4c992423-78aa-47f2-9542-6fc88fefda4b" containerID="106d0527e082ba9a55399e9d4de109e3e590f712068ff0fac4f6f2a717cadd58" exitCode=0 Apr 22 17:57:38.783499 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:38.783059 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" 
event={"ID":"4c992423-78aa-47f2-9542-6fc88fefda4b","Type":"ContainerDied","Data":"106d0527e082ba9a55399e9d4de109e3e590f712068ff0fac4f6f2a717cadd58"} Apr 22 17:57:38.783499 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:38.783459 2564 scope.go:117] "RemoveContainer" containerID="106d0527e082ba9a55399e9d4de109e3e590f712068ff0fac4f6f2a717cadd58" Apr 22 17:57:39.787529 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:39.787489 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cnm65" event={"ID":"4c992423-78aa-47f2-9542-6fc88fefda4b","Type":"ContainerStarted","Data":"3962dfb66bf3f5c35b148183b53287777b8bb18ace07ae43053178369619b819"} Apr 22 17:57:42.796331 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:42.796299 2564 generic.go:358] "Generic (PLEG): container finished" podID="ec52f02a-c723-4c97-a74b-1348c7d84b33" containerID="7f8cb1632cd6afa792da346b07df0d89b35f833dee67f64e2e40f0f6eaa948ed" exitCode=0 Apr 22 17:57:42.796728 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:42.796371 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" event={"ID":"ec52f02a-c723-4c97-a74b-1348c7d84b33","Type":"ContainerDied","Data":"7f8cb1632cd6afa792da346b07df0d89b35f833dee67f64e2e40f0f6eaa948ed"} Apr 22 17:57:42.796728 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:42.796685 2564 scope.go:117] "RemoveContainer" containerID="7f8cb1632cd6afa792da346b07df0d89b35f833dee67f64e2e40f0f6eaa948ed" Apr 22 17:57:43.801251 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:43.801220 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jpj98" event={"ID":"ec52f02a-c723-4c97-a74b-1348c7d84b33","Type":"ContainerStarted","Data":"e72f0478c88d14e15de08bdbe56461cdd6d97740c361b6d351945fde225d18cb"} Apr 22 17:57:49.868233 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:49.868193 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:57:49.870539 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:49.870515 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c1e2467-3796-47ad-928c-f82f435261e9-metrics-certs\") pod \"network-metrics-daemon-wcgxk\" (UID: \"2c1e2467-3796-47ad-928c-f82f435261e9\") " pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:57:49.956728 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:49.956698 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\"" Apr 22 17:57:49.964994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:49.964965 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcgxk" Apr 22 17:57:50.086054 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:50.085963 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wcgxk"] Apr 22 17:57:50.088739 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:57:50.088704 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1e2467_3796_47ad_928c_f82f435261e9.slice/crio-dc8fae5567c9cf69af76751fd9980d5bad385e1defcf6f4e03871d153cbde726 WatchSource:0}: Error finding container dc8fae5567c9cf69af76751fd9980d5bad385e1defcf6f4e03871d153cbde726: Status 404 returned error can't find the container with id dc8fae5567c9cf69af76751fd9980d5bad385e1defcf6f4e03871d153cbde726 Apr 22 17:57:50.827609 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:50.827561 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcgxk" event={"ID":"2c1e2467-3796-47ad-928c-f82f435261e9","Type":"ContainerStarted","Data":"dc8fae5567c9cf69af76751fd9980d5bad385e1defcf6f4e03871d153cbde726"} Apr 22 17:57:51.832467 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:51.832425 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcgxk" event={"ID":"2c1e2467-3796-47ad-928c-f82f435261e9","Type":"ContainerStarted","Data":"62103e174b005d0c9bfa4a1bd234f3f34c9125bdc2798e51a80d24a8f8b30042"} Apr 22 17:57:51.832844 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:51.832473 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcgxk" event={"ID":"2c1e2467-3796-47ad-928c-f82f435261e9","Type":"ContainerStarted","Data":"3513ac79fafb40657e303b1a00414a0efec36a39390f9a94f8918f2a78166175"} Apr 22 17:57:51.848830 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:57:51.848785 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-wcgxk" podStartSLOduration=251.758952194 podStartE2EDuration="4m12.848772476s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:57:50.090508586 +0000 UTC m=+251.569031991" lastFinishedPulling="2026-04-22 17:57:51.180328855 +0000 UTC m=+252.658852273" observedRunningTime="2026-04-22 17:57:51.847566295 +0000 UTC m=+253.326089718" watchObservedRunningTime="2026-04-22 17:57:51.848772476 +0000 UTC m=+253.327295899" Apr 22 17:58:03.305805 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:03.305766 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:03.321653 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:03.321623 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:03.882206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:03.882179 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:16.513260 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:16.513208 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" podUID="da61e571-00a2-4ad8-86ad-1156286a7409" Apr 22 17:58:16.513260 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:16.513209 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sgmcw" podUID="4b38c080-af4d-4d73-ad1f-c364849e7212" Apr 22 17:58:16.905387 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:16.905355 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:58:16.906082 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:16.906060 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sgmcw" Apr 22 17:58:17.038278 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.038233 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:58:17.038976 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.038933 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-metric" containerID="cri-o://f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01" gracePeriod=120 Apr 22 17:58:17.039174 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.039134 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="prom-label-proxy" containerID="cri-o://177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a" gracePeriod=120 Apr 22 17:58:17.039471 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.039423 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="config-reloader" containerID="cri-o://dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a" gracePeriod=120 Apr 22 17:58:17.039617 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.039462 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy" 
containerID="cri-o://b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688" gracePeriod=120 Apr 22 17:58:17.039690 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.039617 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-web" containerID="cri-o://9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31" gracePeriod=120 Apr 22 17:58:17.040193 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.039452 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="alertmanager" containerID="cri-o://efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f" gracePeriod=120 Apr 22 17:58:17.911352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911317 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a" exitCode=0 Apr 22 17:58:17.911352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911344 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688" exitCode=0 Apr 22 17:58:17.911352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911350 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a" exitCode=0 Apr 22 17:58:17.911352 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911356 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f" exitCode=0 Apr 22 17:58:17.911804 
ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911392 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"} Apr 22 17:58:17.911804 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911427 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"} Apr 22 17:58:17.911804 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911439 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"} Apr 22 17:58:17.911804 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:17.911448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"} Apr 22 17:58:18.276522 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.276497 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:18.325133 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325102 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325279 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325148 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5n44\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325279 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325179 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325279 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325251 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325284 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") 
" Apr 22 17:58:18.325481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325452 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325587 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325510 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325587 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325542 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325587 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325572 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325732 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325595 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:18.325732 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325609 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325732 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325661 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:18.325732 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325681 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325958 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325731 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") " Apr 22 17:58:18.325958 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.325767 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db\") pod \"d578c48e-d4be-4f1a-bfa4-022240000a28\" (UID: \"d578c48e-d4be-4f1a-bfa4-022240000a28\") "
Apr 22 17:58:18.326137 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.326114 2564 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-metrics-client-ca\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.326222 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.326144 2564 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.326469 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.326446 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:58:18.329210 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.329181 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44" (OuterVolumeSpecName: "kube-api-access-d5n44") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "kube-api-access-d5n44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:58:18.330219 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330099 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:58:18.330219 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330121 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.330219 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330190 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.330219 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330212 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out" (OuterVolumeSpecName: "config-out") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:58:18.330494 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330470 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.330778 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.330753 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.332155 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.332117 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume" (OuterVolumeSpecName: "config-volume") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.334570 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.334503 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.341160 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.341136 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config" (OuterVolumeSpecName: "web-config") pod "d578c48e-d4be-4f1a-bfa4-022240000a28" (UID: "d578c48e-d4be-4f1a-bfa4-022240000a28"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:18.427271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427226 2564 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-tls-assets\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427264 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427275 2564 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-alertmanager-main-db\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427285 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5n44\" (UniqueName: \"kubernetes.io/projected/d578c48e-d4be-4f1a-bfa4-022240000a28-kube-api-access-d5n44\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427295 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-main-tls\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427304 2564 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d578c48e-d4be-4f1a-bfa4-022240000a28-config-out\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427314 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427323 2564 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427335 2564 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-cluster-tls-config\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427344 2564 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-config-volume\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.427514 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.427353 2564 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d578c48e-d4be-4f1a-bfa4-022240000a28-web-config\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:18.917265 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917233 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01" exitCode=0
Apr 22 17:58:18.917265 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917258 2564 generic.go:358] "Generic (PLEG): container finished" podID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerID="9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31" exitCode=0
Apr 22 17:58:18.917681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917278 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"}
Apr 22 17:58:18.917681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917309 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"}
Apr 22 17:58:18.917681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917319 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d578c48e-d4be-4f1a-bfa4-022240000a28","Type":"ContainerDied","Data":"f797246eb700e535b3cf7d41412f0c588f796656e8ae18d8eef53f1194f5ac9c"}
Apr 22 17:58:18.917681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917335 2564 scope.go:117] "RemoveContainer" containerID="177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"
Apr 22 17:58:18.917681 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.917334 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:18.924770 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.924750 2564 scope.go:117] "RemoveContainer" containerID="f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"
Apr 22 17:58:18.931883 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.931850 2564 scope.go:117] "RemoveContainer" containerID="b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"
Apr 22 17:58:18.937994 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.937971 2564 scope.go:117] "RemoveContainer" containerID="9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"
Apr 22 17:58:18.941916 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.941818 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:58:18.945284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.945258 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:58:18.946326 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.946310 2564 scope.go:117] "RemoveContainer" containerID="dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"
Apr 22 17:58:18.952781 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.952765 2564 scope.go:117] "RemoveContainer" containerID="efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"
Apr 22 17:58:18.959271 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.959253 2564 scope.go:117] "RemoveContainer" containerID="43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"
Apr 22 17:58:18.966228 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.966205 2564 scope.go:117] "RemoveContainer" containerID="177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"
Apr 22 17:58:18.966508 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.966475 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a\": container with ID starting with 177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a not found: ID does not exist" containerID="177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"
Apr 22 17:58:18.966571 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.966519 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"} err="failed to get container status \"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a\": rpc error: code = NotFound desc = could not find container \"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a\": container with ID starting with 177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a not found: ID does not exist"
Apr 22 17:58:18.966617 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.966553 2564 scope.go:117] "RemoveContainer" containerID="f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"
Apr 22 17:58:18.966781 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.966765 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01\": container with ID starting with f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01 not found: ID does not exist" containerID="f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"
Apr 22 17:58:18.966827 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.966785 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"} err="failed to get container status \"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01\": rpc error: code = NotFound desc = could not find container \"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01\": container with ID starting with f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01 not found: ID does not exist"
Apr 22 17:58:18.966827 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.966798 2564 scope.go:117] "RemoveContainer" containerID="b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"
Apr 22 17:58:18.969849 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.967214 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688\": container with ID starting with b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688 not found: ID does not exist" containerID="b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"
Apr 22 17:58:18.969849 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.967241 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"} err="failed to get container status \"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688\": rpc error: code = NotFound desc = could not find container \"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688\": container with ID starting with b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688 not found: ID does not exist"
Apr 22 17:58:18.969849 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.967258 2564 scope.go:117] "RemoveContainer" containerID="9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"
Apr 22 17:58:18.970127 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.970005 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31\": container with ID starting with 9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31 not found: ID does not exist" containerID="9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"
Apr 22 17:58:18.970127 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.970038 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"} err="failed to get container status \"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31\": rpc error: code = NotFound desc = could not find container \"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31\": container with ID starting with 9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31 not found: ID does not exist"
Apr 22 17:58:18.970127 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.970062 2564 scope.go:117] "RemoveContainer" containerID="dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"
Apr 22 17:58:18.970731 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.970569 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a\": container with ID starting with dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a not found: ID does not exist" containerID="dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"
Apr 22 17:58:18.970731 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.970600 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"} err="failed to get container status \"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a\": rpc error: code = NotFound desc = could not find container \"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a\": container with ID starting with dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a not found: ID does not exist"
Apr 22 17:58:18.970731 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.970622 2564 scope.go:117] "RemoveContainer" containerID="efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"
Apr 22 17:58:18.971035 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.970984 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f\": container with ID starting with efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f not found: ID does not exist" containerID="efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"
Apr 22 17:58:18.971035 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971023 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"} err="failed to get container status \"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f\": rpc error: code = NotFound desc = could not find container \"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f\": container with ID starting with efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f not found: ID does not exist"
Apr 22 17:58:18.971176 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971048 2564 scope.go:117] "RemoveContainer" containerID="43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"
Apr 22 17:58:18.971339 ip-10-0-143-11 kubenswrapper[2564]: E0422 17:58:18.971301 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff\": container with ID starting with 43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff not found: ID does not exist" containerID="43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"
Apr 22 17:58:18.971431 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971364 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"} err="failed to get container status \"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff\": rpc error: code = NotFound desc = could not find container \"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff\": container with ID starting with 43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff not found: ID does not exist"
Apr 22 17:58:18.971431 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971384 2564 scope.go:117] "RemoveContainer" containerID="177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"
Apr 22 17:58:18.971684 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971661 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a"} err="failed to get container status \"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a\": rpc error: code = NotFound desc = could not find container \"177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a\": container with ID starting with 177b7d7fa540cad2fbb2d27376ded62d9c3fcbee98d5f6576c04f97afd94068a not found: ID does not exist"
Apr 22 17:58:18.971684 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971683 2564 scope.go:117] "RemoveContainer" containerID="f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"
Apr 22 17:58:18.972037 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.971996 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01"} err="failed to get container status \"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01\": rpc error: code = NotFound desc = could not find container \"f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01\": container with ID starting with f1066c0e1b6c76c2ea100fa5adc34e2c9ca1617d66d7adf6299c9f201120fa01 not found: ID does not exist"
Apr 22 17:58:18.972140 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972044 2564 scope.go:117] "RemoveContainer" containerID="b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"
Apr 22 17:58:18.972302 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972277 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688"} err="failed to get container status \"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688\": rpc error: code = NotFound desc = could not find container \"b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688\": container with ID starting with b34117d80bd5e16fd4aa3461f5a8eb9375217a585ea95770f645501f3d4c9688 not found: ID does not exist"
Apr 22 17:58:18.972387 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972304 2564 scope.go:117] "RemoveContainer" containerID="9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"
Apr 22 17:58:18.972532 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972505 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:58:18.972636 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972617 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31"} err="failed to get container status \"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31\": rpc error: code = NotFound desc = could not find container \"9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31\": container with ID starting with 9888d42f0d18085f13de02367028b8b6e04866b0fc7ddff973b4badf3a442f31 not found: ID does not exist"
Apr 22 17:58:18.972694 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972637 2564 scope.go:117] "RemoveContainer" containerID="dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"
Apr 22 17:58:18.972922 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972897 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a"} err="failed to get container status \"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a\": rpc error: code = NotFound desc = could not find container \"dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a\": container with ID starting with dd7affed86a76ea6e37983f0bb4bd7e155841dcb4cceaa00adc81f370d3c085a not found: ID does not exist"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972922 2564 scope.go:117] "RemoveContainer" containerID="efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972939 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-web"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972953 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-web"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972975 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972980 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy"
Apr 22 17:58:18.972986 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972987 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="alertmanager"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.972993 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="alertmanager"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973000 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="config-reloader"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973005 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="config-reloader"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973011 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-metric"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973016 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-metric"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973022 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="prom-label-proxy"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973029 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="prom-label-proxy"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973041 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="init-config-reloader"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973047 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="init-config-reloader"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973107 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-metric"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973117 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="config-reloader"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973123 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="alertmanager"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973132 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="prom-label-proxy"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973138 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy"
Apr 22 17:58:18.973191 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973144 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" containerName="kube-rbac-proxy-web"
Apr 22 17:58:18.973652 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973137 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f"} err="failed to get container status \"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f\": rpc error: code = NotFound desc = could not find container \"efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f\": container with ID starting with efdcf08b0a7b9ca600681fd2817e526b8fc0844533a55205c7ffe36826000a0f not found: ID does not exist"
Apr 22 17:58:18.973652 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973227 2564 scope.go:117] "RemoveContainer" containerID="43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"
Apr 22 17:58:18.973652 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.973435 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff"} err="failed to get container status \"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff\": rpc error: code = NotFound desc = could not find container \"43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff\": container with ID starting with 43d2c132d731387c2f476ea6ab8903afad99fe2a0ec922c250ee4596d16225ff not found: ID does not exist"
Apr 22 17:58:18.978001 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.977978 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:18.980792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980771 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 17:58:18.980920 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980818 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 17:58:18.980920 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980771 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 17:58:18.980920 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980889 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 17:58:18.980920 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980904 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 17:58:18.981128 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980827 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 17:58:18.981128 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.980960 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5wsth\""
Apr 22 17:58:18.981128 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.981002 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 17:58:18.981128 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.981005 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 17:58:18.987277 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.987252 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 17:58:18.993533 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:18.991027 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:58:19.033188 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033218 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-out\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033241 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033263 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033293 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033330 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033318 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033345 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:58:19.033481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033363 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsvn\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-kube-api-access-gwsvn\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 
22 17:58:19.033481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.033481 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033437 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.033598 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033480 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.033598 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033504 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.033598 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.033530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-web-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.057831 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.057802 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d578c48e-d4be-4f1a-bfa4-022240000a28" path="/var/lib/kubelet/pods/d578c48e-d4be-4f1a-bfa4-022240000a28/volumes" Apr 22 17:58:19.134367 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-out\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134367 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134367 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134534 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134534 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134426 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134612 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134556 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134668 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134618 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134668 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134650 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsvn\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-kube-api-access-gwsvn\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134763 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134678 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134763 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134745 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134893 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134778 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.134893 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134814 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.135007 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134909 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-web-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.135007 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134977 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.135007 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.134979 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.137380 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.137351 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-out\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.137523 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.137501 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.137583 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.137501 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.137632 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.137586 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.137830 ip-10-0-143-11 kubenswrapper[2564]: 
I0422 17:58:19.137790 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.138007 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.137979 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.138263 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.138238 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.138350 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.138304 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab29b6f6-6448-4c20-9469-9d2a350d7547-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.138390 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.138350 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.138429 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.138391 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.139528 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.139510 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab29b6f6-6448-4c20-9469-9d2a350d7547-web-config\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.143230 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.143214 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsvn\" (UniqueName: \"kubernetes.io/projected/ab29b6f6-6448-4c20-9469-9d2a350d7547-kube-api-access-gwsvn\") pod \"alertmanager-main-0\" (UID: \"ab29b6f6-6448-4c20-9469-9d2a350d7547\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.290476 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.290387 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:58:19.417047 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.417019 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:58:19.418854 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:19.418827 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab29b6f6_6448_4c20_9469_9d2a350d7547.slice/crio-92376a820fca21c178d231127ec99f4a8b3789b9a1f4521257610b5821c096b4 WatchSource:0}: Error finding container 92376a820fca21c178d231127ec99f4a8b3789b9a1f4521257610b5821c096b4: Status 404 returned error can't find the container with id 92376a820fca21c178d231127ec99f4a8b3789b9a1f4521257610b5821c096b4 Apr 22 17:58:19.841168 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.841118 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:58:19.841375 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.841274 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: \"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:58:19.843455 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.843430 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b38c080-af4d-4d73-ad1f-c364849e7212-metrics-tls\") pod \"dns-default-sgmcw\" (UID: 
\"4b38c080-af4d-4d73-ad1f-c364849e7212\") " pod="openshift-dns/dns-default-sgmcw" Apr 22 17:58:19.843545 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.843477 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/da61e571-00a2-4ad8-86ad-1156286a7409-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mhbld\" (UID: \"da61e571-00a2-4ad8-86ad-1156286a7409\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:58:19.909136 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.909110 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-q68l9\"" Apr 22 17:58:19.909792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.909779 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:58:19.916750 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.916727 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" Apr 22 17:58:19.917325 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.917307 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sgmcw" Apr 22 17:58:19.920780 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.920753 2564 generic.go:358] "Generic (PLEG): container finished" podID="ab29b6f6-6448-4c20-9469-9d2a350d7547" containerID="5d9b123e9ba0c5ce83b9b068edf971aecca561dc5cd33f64dc3ff7c126561dab" exitCode=0 Apr 22 17:58:19.920908 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.920833 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerDied","Data":"5d9b123e9ba0c5ce83b9b068edf971aecca561dc5cd33f64dc3ff7c126561dab"} Apr 22 17:58:19.920908 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.920887 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"92376a820fca21c178d231127ec99f4a8b3789b9a1f4521257610b5821c096b4"} Apr 22 17:58:19.942657 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.942625 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:58:19.946536 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:19.946499 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/341b2cf5-e5b2-4950-98bb-f85daf6a0a5f-cert\") pod \"ingress-canary-bgpms\" (UID: \"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f\") " pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:58:20.058365 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.058338 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mhbld"] Apr 22 17:58:20.063924 
ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:20.063896 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda61e571_00a2_4ad8_86ad_1156286a7409.slice/crio-28f64e8475d04b5dce0d844a9217b584defd1188eedbe705f88d661ae55ba231 WatchSource:0}: Error finding container 28f64e8475d04b5dce0d844a9217b584defd1188eedbe705f88d661ae55ba231: Status 404 returned error can't find the container with id 28f64e8475d04b5dce0d844a9217b584defd1188eedbe705f88d661ae55ba231 Apr 22 17:58:20.082272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.079731 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sgmcw"] Apr 22 17:58:20.082938 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:20.082911 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b38c080_af4d_4d73_ad1f_c364849e7212.slice/crio-1d5d214f39852378230f4cbf71f0bb671094561d11ec0951fd8b03adf9376640 WatchSource:0}: Error finding container 1d5d214f39852378230f4cbf71f0bb671094561d11ec0951fd8b03adf9376640: Status 404 returned error can't find the container with id 1d5d214f39852378230f4cbf71f0bb671094561d11ec0951fd8b03adf9376640 Apr 22 17:58:20.156277 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.156251 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:58:20.164696 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.164666 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bgpms" Apr 22 17:58:20.307833 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.307790 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bgpms"] Apr 22 17:58:20.312188 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:20.312155 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341b2cf5_e5b2_4950_98bb_f85daf6a0a5f.slice/crio-2c2aa2146d697359bc92898b19d0a1fceace65e3fb55701861ee759a61a1d59c WatchSource:0}: Error finding container 2c2aa2146d697359bc92898b19d0a1fceace65e3fb55701861ee759a61a1d59c: Status 404 returned error can't find the container with id 2c2aa2146d697359bc92898b19d0a1fceace65e3fb55701861ee759a61a1d59c Apr 22 17:58:20.928231 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.928190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgmcw" event={"ID":"4b38c080-af4d-4d73-ad1f-c364849e7212","Type":"ContainerStarted","Data":"1d5d214f39852378230f4cbf71f0bb671094561d11ec0951fd8b03adf9376640"} Apr 22 17:58:20.932622 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932591 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"5f0fb3c511825026a28c4924e87e8ac83dfac2be722a882db87cdb47a1777132"} Apr 22 17:58:20.932766 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932633 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"cea37f6c084282e8c9b2931de476a3edda0fd48212e73fc99c0e22b5c059bf12"} Apr 22 17:58:20.932766 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932647 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"c81c9c4b7a79921a0e1859f70122060df3965c0b4daf3936a87e513a9fb1d912"} Apr 22 17:58:20.932766 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932659 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"85faf6c42bea0bdcbbf2736425c3081ba0ae346429652389145224d5259b3576"} Apr 22 17:58:20.932766 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932671 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"85e642f6a4e6231db0e17656d46c25d0f922406f7ad4ca503e73bf3dc5e7dd87"} Apr 22 17:58:20.932766 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.932685 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab29b6f6-6448-4c20-9469-9d2a350d7547","Type":"ContainerStarted","Data":"bb86ed666dc0e9b54c6bca24245182bcbd24144c57e898e1f6910ebf99f46300"} Apr 22 17:58:20.934741 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.934704 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bgpms" event={"ID":"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f","Type":"ContainerStarted","Data":"2c2aa2146d697359bc92898b19d0a1fceace65e3fb55701861ee759a61a1d59c"} Apr 22 17:58:20.936823 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.936780 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" event={"ID":"da61e571-00a2-4ad8-86ad-1156286a7409","Type":"ContainerStarted","Data":"28f64e8475d04b5dce0d844a9217b584defd1188eedbe705f88d661ae55ba231"} Apr 22 17:58:20.961790 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:20.961732 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.961714492 podStartE2EDuration="2.961714492s" podCreationTimestamp="2026-04-22 17:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:58:20.96063304 +0000 UTC m=+282.439156474" watchObservedRunningTime="2026-04-22 17:58:20.961714492 +0000 UTC m=+282.440237915" Apr 22 17:58:21.071341 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.071293 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6554b8784f-ll4hp"] Apr 22 17:58:21.075264 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.075242 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.080162 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.079884 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 17:58:21.080162 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.079903 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 17:58:21.080162 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.079941 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 17:58:21.080162 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.080045 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-knlh9\"" Apr 22 17:58:21.080441 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.080274 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 17:58:21.080441 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.080367 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 17:58:21.088924 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.088723 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 17:58:21.091875 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.091832 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6554b8784f-ll4hp"] Apr 22 17:58:21.155159 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155306 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155182 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155306 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155212 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155306 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155241 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155306 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155279 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-serving-certs-ca-bundle\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155613 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155325 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-federate-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155613 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-metrics-client-ca\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " 
pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.155613 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.155382 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdm5\" (UniqueName: \"kubernetes.io/projected/a8057716-e97e-4720-8b6d-c8da14dbf284-kube-api-access-dqdm5\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.256968 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.256887 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-federate-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.256968 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.256948 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-metrics-client-ca\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257175 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257019 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdm5\" (UniqueName: \"kubernetes.io/projected/a8057716-e97e-4720-8b6d-c8da14dbf284-kube-api-access-dqdm5\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257175 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257175 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257162 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257188 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.257321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-serving-certs-ca-bundle\") pod \"telemeter-client-6554b8784f-ll4hp\" 
(UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.258160 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.257812 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-metrics-client-ca\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.258160 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.258012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-serving-certs-ca-bundle\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.258405 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.258355 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.260853 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.260812 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-telemeter-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.260853 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.260833 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.261429 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.261410 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-secret-telemeter-client\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.261637 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.261604 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a8057716-e97e-4720-8b6d-c8da14dbf284-federate-client-tls\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.269486 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.269463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqdm5\" (UniqueName: \"kubernetes.io/projected/a8057716-e97e-4720-8b6d-c8da14dbf284-kube-api-access-dqdm5\") pod \"telemeter-client-6554b8784f-ll4hp\" (UID: \"a8057716-e97e-4720-8b6d-c8da14dbf284\") " pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.351459 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.351403 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:58:21.352003 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.351966 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="prometheus" containerID="cri-o://ab6f4378d073fe3762db588cfb57af06b9038d87fd3ce04fbff034f26fcbd125" gracePeriod=600 Apr 22 17:58:21.352003 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.351974 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy" containerID="cri-o://4756a4f4cfee97ca7f0824b4045208a5278e3351f7245c85062795124e0c7959" gracePeriod=600 Apr 22 17:58:21.352206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.351976 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="thanos-sidecar" containerID="cri-o://96900e6fd2f3c2a97b932264172b07e85df0ea0c73e24044aab76b661121615f" gracePeriod=600 Apr 22 17:58:21.352206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.352115 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="config-reloader" containerID="cri-o://b487639879de08c486908e78bc3154289831ca1a61a18bb81eda78476abd6df3" gracePeriod=600 Apr 22 17:58:21.352206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.352128 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-thanos" containerID="cri-o://8398240f08fcd471760f8bdbb422d057959e0b48fd161fcebafc4409694b1ce1" gracePeriod=600 Apr 22 17:58:21.352206 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.352171 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" 
containerName="kube-rbac-proxy-web" containerID="cri-o://6fa81960161a48cc8303b604f6232707ea9981dab2a3225c27962788f4895ac1" gracePeriod=600 Apr 22 17:58:21.392435 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.392407 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945380 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="8398240f08fcd471760f8bdbb422d057959e0b48fd161fcebafc4409694b1ce1" exitCode=0 Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945414 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="4756a4f4cfee97ca7f0824b4045208a5278e3351f7245c85062795124e0c7959" exitCode=0 Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945426 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="6fa81960161a48cc8303b604f6232707ea9981dab2a3225c27962788f4895ac1" exitCode=0 Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945434 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="96900e6fd2f3c2a97b932264172b07e85df0ea0c73e24044aab76b661121615f" exitCode=0 Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945443 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="b487639879de08c486908e78bc3154289831ca1a61a18bb81eda78476abd6df3" exitCode=0 Apr 22 17:58:21.945464 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945451 2564 generic.go:358] "Generic (PLEG): container finished" podID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerID="ab6f4378d073fe3762db588cfb57af06b9038d87fd3ce04fbff034f26fcbd125" 
exitCode=0 Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945572 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"8398240f08fcd471760f8bdbb422d057959e0b48fd161fcebafc4409694b1ce1"} Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945603 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"4756a4f4cfee97ca7f0824b4045208a5278e3351f7245c85062795124e0c7959"} Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945619 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"6fa81960161a48cc8303b604f6232707ea9981dab2a3225c27962788f4895ac1"} Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945632 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"96900e6fd2f3c2a97b932264172b07e85df0ea0c73e24044aab76b661121615f"} Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945648 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"b487639879de08c486908e78bc3154289831ca1a61a18bb81eda78476abd6df3"} Apr 22 17:58:21.946170 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.945664 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"ab6f4378d073fe3762db588cfb57af06b9038d87fd3ce04fbff034f26fcbd125"} Apr 22 
17:58:21.947303 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.947271 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" event={"ID":"da61e571-00a2-4ad8-86ad-1156286a7409","Type":"ContainerStarted","Data":"335fb1f580f7a3b9cde66078baaaac6cde04f80b0d768600defd082eac0f839d"} Apr 22 17:58:21.964198 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:21.963486 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mhbld" podStartSLOduration=259.954033946 podStartE2EDuration="4m20.963469162s" podCreationTimestamp="2026-04-22 17:54:01 +0000 UTC" firstStartedPulling="2026-04-22 17:58:20.065897544 +0000 UTC m=+281.544420948" lastFinishedPulling="2026-04-22 17:58:21.075332746 +0000 UTC m=+282.553856164" observedRunningTime="2026-04-22 17:58:21.962779886 +0000 UTC m=+283.441303311" watchObservedRunningTime="2026-04-22 17:58:21.963469162 +0000 UTC m=+283.441992582" Apr 22 17:58:22.200329 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.200224 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:22.266172 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266141 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266192 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266321 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266224 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266480 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266458 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266550 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266515 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 
17:58:22.266604 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266545 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266604 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266572 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzxl8\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.266699 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.266625 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.267241 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.267211 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:58:22.269002 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.268270 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:22.269700 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.269675 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.269792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.269746 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.269792 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.269773 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.269806 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.269835 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270072 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270116 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270143 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270189 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270221 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls\") pod \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\" (UID: \"6ce9887d-e613-4031-b32d-7e8e61da9ca7\") " Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270501 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270503 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:22.270819 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.270520 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-k8s-db\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 17:58:22.271337 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.271188 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.271337 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.271197 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out" (OuterVolumeSpecName: "config-out") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:58:22.271337 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.271282 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.271850 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.271819 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.274232 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.274193 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config" (OuterVolumeSpecName: "config") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.274538 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.274501 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:22.274674 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.273799 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:22.275008 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.274829 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:58:22.275688 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.275649 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8" (OuterVolumeSpecName: "kube-api-access-wzxl8") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "kube-api-access-wzxl8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:58:22.275913 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.275850 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.279399 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.279309 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.279844 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.279786 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:58:22.280326 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.280260 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.280943 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.280919 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:58:22.300917 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.300882 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config" (OuterVolumeSpecName: "web-config") pod "6ce9887d-e613-4031-b32d-7e8e61da9ca7" (UID: "6ce9887d-e613-4031-b32d-7e8e61da9ca7"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:22.339297 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.339205 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6554b8784f-ll4hp"]
Apr 22 17:58:22.340724 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:22.340699 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8057716_e97e_4720_8b6d_c8da14dbf284.slice/crio-6d356a3554ec701e2df6ba83113ac6b48792d6280d3f3928403454d7a35df8aa WatchSource:0}: Error finding container 6d356a3554ec701e2df6ba83113ac6b48792d6280d3f3928403454d7a35df8aa: Status 404 returned error can't find the container with id 6d356a3554ec701e2df6ba83113ac6b48792d6280d3f3928403454d7a35df8aa
Apr 22 17:58:22.371669 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371636 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371669 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371673 2564 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config-out\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371691 2564 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-kube-rbac-proxy\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371708 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371725 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzxl8\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-kube-api-access-wzxl8\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371740 2564 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371755 2564 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371769 2564 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-config\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371786 2564 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-metrics-client-certs\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.371812 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371800 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-metrics-client-ca\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371814 2564 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-web-config\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371829 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371844 2564 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371882 2564 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ce9887d-e613-4031-b32d-7e8e61da9ca7-tls-assets\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371897 2564 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce9887d-e613-4031-b32d-7e8e61da9ca7-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.372122 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.371910 2564 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ce9887d-e613-4031-b32d-7e8e61da9ca7-secret-grpc-tls\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\""
Apr 22 17:58:22.955364 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.955325 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ce9887d-e613-4031-b32d-7e8e61da9ca7","Type":"ContainerDied","Data":"b0f3be24ec347d64ce811cfe1c4e0bf08270a2317497c5d295e3372d463e7929"}
Apr 22 17:58:22.955364 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.955383 2564 scope.go:117] "RemoveContainer" containerID="8398240f08fcd471760f8bdbb422d057959e0b48fd161fcebafc4409694b1ce1"
Apr 22 17:58:22.955959 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.955412 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:22.957290 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.957250 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bgpms" event={"ID":"341b2cf5-e5b2-4950-98bb-f85daf6a0a5f","Type":"ContainerStarted","Data":"c7f9b9d69820373796938734e0f5d90973d74394b6736f5d17b03ab0172a059b"}
Apr 22 17:58:22.959231 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.959204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgmcw" event={"ID":"4b38c080-af4d-4d73-ad1f-c364849e7212","Type":"ContainerStarted","Data":"9b2f7de83e8913f83bc71dc4040f5db69b5dfd9c149f9966486788b07d788443"}
Apr 22 17:58:22.959363 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.959238 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sgmcw" event={"ID":"4b38c080-af4d-4d73-ad1f-c364849e7212","Type":"ContainerStarted","Data":"6eaf1686c8c296c88b9de16d7ac76e90bbac8824dbc651d27ad005ae1940e441"}
Apr 22 17:58:22.959363 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.959330 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sgmcw"
Apr 22 17:58:22.961184 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.961146 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" event={"ID":"a8057716-e97e-4720-8b6d-c8da14dbf284","Type":"ContainerStarted","Data":"6d356a3554ec701e2df6ba83113ac6b48792d6280d3f3928403454d7a35df8aa"}
Apr 22 17:58:22.965592 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.965482 2564 scope.go:117] "RemoveContainer" containerID="4756a4f4cfee97ca7f0824b4045208a5278e3351f7245c85062795124e0c7959"
Apr 22 17:58:22.973545 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.973523 2564 scope.go:117] "RemoveContainer" containerID="6fa81960161a48cc8303b604f6232707ea9981dab2a3225c27962788f4895ac1"
Apr 22 17:58:22.980240 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.980094 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sgmcw" podStartSLOduration=251.87892835 podStartE2EDuration="4m13.980079115s" podCreationTimestamp="2026-04-22 17:54:09 +0000 UTC" firstStartedPulling="2026-04-22 17:58:20.086060512 +0000 UTC m=+281.564583916" lastFinishedPulling="2026-04-22 17:58:22.187211267 +0000 UTC m=+283.665734681" observedRunningTime="2026-04-22 17:58:22.978807822 +0000 UTC m=+284.457331247" watchObservedRunningTime="2026-04-22 17:58:22.980079115 +0000 UTC m=+284.458602542"
Apr 22 17:58:22.981616 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.981596 2564 scope.go:117] "RemoveContainer" containerID="96900e6fd2f3c2a97b932264172b07e85df0ea0c73e24044aab76b661121615f"
Apr 22 17:58:22.989312 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.989291 2564 scope.go:117] "RemoveContainer" containerID="b487639879de08c486908e78bc3154289831ca1a61a18bb81eda78476abd6df3"
Apr 22 17:58:22.995047 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.995000 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bgpms" podStartSLOduration=252.118906444 podStartE2EDuration="4m13.994985968s" podCreationTimestamp="2026-04-22 17:54:09 +0000 UTC" firstStartedPulling="2026-04-22 17:58:20.314172314 +0000 UTC m=+281.792695715" lastFinishedPulling="2026-04-22 17:58:22.190251823 +0000 UTC m=+283.668775239" observedRunningTime="2026-04-22 17:58:22.993982162 +0000 UTC m=+284.472505587" watchObservedRunningTime="2026-04-22 17:58:22.994985968 +0000 UTC m=+284.473509393"
Apr 22 17:58:22.996765 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:22.996750 2564 scope.go:117] "RemoveContainer" containerID="ab6f4378d073fe3762db588cfb57af06b9038d87fd3ce04fbff034f26fcbd125"
Apr 22 17:58:23.006671 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.006644 2564 scope.go:117] "RemoveContainer" containerID="da369aa023f315276ae065533960ff7c03db04d28284559e1729ff6fac441ad0"
Apr 22 17:58:23.009979 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.009956 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:58:23.014187 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.014157 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:58:23.039910 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.039887 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:58:23.040232 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040217 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="thanos-sidecar"
Apr 22 17:58:23.040284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040235 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="thanos-sidecar"
Apr 22 17:58:23.040284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040248 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="init-config-reloader"
Apr 22 17:58:23.040284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040254 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="init-config-reloader"
Apr 22 17:58:23.040284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040273 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-web"
Apr 22 17:58:23.040284 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040280 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-web"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040290 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-thanos"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040296 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-thanos"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040307 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040313 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040325 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="config-reloader"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040330 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="config-reloader"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040343 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="prometheus"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040350 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="prometheus"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040409 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="thanos-sidecar"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040420 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-web"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040427 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="prometheus"
Apr 22 17:58:23.040436 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040434 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy-thanos"
Apr 22 17:58:23.040847 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040441 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="kube-rbac-proxy"
Apr 22 17:58:23.040847 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.040451 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" containerName="config-reloader"
Apr 22 17:58:23.045484 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.045467 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.048608 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.048279 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hhcvh\""
Apr 22 17:58:23.048608 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.048309 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 17:58:23.048608 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.048456 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 17:58:23.048993 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.048972 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049206 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049228 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049258 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049213 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049299 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049300 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 17:58:23.049385 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049373 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fmjgopp7k07jq\""
Apr 22 17:58:23.049709 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.049377 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 17:58:23.062167 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.061636 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 17:58:23.062167 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.061704 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 17:58:23.062743 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.062525 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 17:58:23.066103 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.066082 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce9887d-e613-4031-b32d-7e8e61da9ca7" path="/var/lib/kubelet/pods/6ce9887d-e613-4031-b32d-7e8e61da9ca7/volumes"
Apr 22 17:58:23.066641 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.066626 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:58:23.078573 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.078544 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.078768 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.078744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.078897 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.078843 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.078971 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.078930 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079026 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.078973 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079130 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079026 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079130 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079050 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl82g\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-kube-api-access-kl82g\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079130 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079107 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079133 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079164 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079188 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079220 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079272 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079244 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079439 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079315 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config-out\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079439 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079359 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079499 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079442 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-web-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079499 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079469 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.079575 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.079493 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.180717 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.180684 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config-out\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.180912 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.180732 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.180912 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.180893 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-web-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181045 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.180947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181045 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.180978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181045 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181030 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181192 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181062 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181192 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181092 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181192 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181192 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181237 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl82g\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-kube-api-access-kl82g\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181298 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181312 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181328 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181382 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181350 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181714 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181383 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.181714 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.181408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.182387 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.182038 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.183813 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.183784 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config-out\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.184009 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.183988 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.184093 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.184012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.184093 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.184077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-web-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.184476 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-config\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.184640 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.184752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.185027 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185282 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.185251 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185567 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.185496 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.185567 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.185503 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.186422 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.186395 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.186422 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.186408 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.186689 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.186668 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:23.188155 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.188133 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName:
\"kubernetes.io/configmap/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:23.191943 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.191924 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl82g\" (UniqueName: \"kubernetes.io/projected/22d4e49c-c889-4f0f-abbe-9dcb85ef4296-kube-api-access-kl82g\") pod \"prometheus-k8s-0\" (UID: \"22d4e49c-c889-4f0f-abbe-9dcb85ef4296\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:23.362493 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.362456 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:23.500803 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.500779 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:58:23.502693 ip-10-0-143-11 kubenswrapper[2564]: W0422 17:58:23.502660 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d4e49c_c889_4f0f_abbe_9dcb85ef4296.slice/crio-3c78cc061c82240ce78ffe49b503f9170b580d50d69d8bcf7d8be46585c63ea6 WatchSource:0}: Error finding container 3c78cc061c82240ce78ffe49b503f9170b580d50d69d8bcf7d8be46585c63ea6: Status 404 returned error can't find the container with id 3c78cc061c82240ce78ffe49b503f9170b580d50d69d8bcf7d8be46585c63ea6 Apr 22 17:58:23.968303 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.968258 2564 generic.go:358] "Generic (PLEG): container finished" podID="22d4e49c-c889-4f0f-abbe-9dcb85ef4296" containerID="e2a07987b9e2154817bf03261cb6f62f20990a4e0b7b7e8c3dc481ce66c45b88" exitCode=0 Apr 22 17:58:23.968716 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.968305 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerDied","Data":"e2a07987b9e2154817bf03261cb6f62f20990a4e0b7b7e8c3dc481ce66c45b88"} Apr 22 17:58:23.968716 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:23.968352 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"3c78cc061c82240ce78ffe49b503f9170b580d50d69d8bcf7d8be46585c63ea6"} Apr 22 17:58:24.978607 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978561 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"1bfa1ec9a2b023e73c650279a5e21d5acb0226e49b067771da16d4ea1691862d"} Apr 22 17:58:24.978607 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978611 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"fafeec2337da1a0ba8ab4bcf8530e4cf88dbc33951bc8f5e8d9277fc14afa935"} Apr 22 17:58:24.979158 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978624 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"1baed4b5a73dc495731d813d84dfbd8b0142bd9b543d3c0ec7ac27a6c82a32f0"} Apr 22 17:58:24.979158 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978637 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"a96a0b2254081335a7235e22c68035b427fe42bf2e4cab3cf377db170521e24c"} Apr 22 17:58:24.979158 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978649 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"de97cd83908edaeb5646c159cb88c471463021ce28faea4e378a360201f48833"} Apr 22 17:58:24.979158 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.978662 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"22d4e49c-c889-4f0f-abbe-9dcb85ef4296","Type":"ContainerStarted","Data":"9f9473e52ff22672da0f69a7ad5117ef401fdaa5800fb11dd67e1fb674f17fce"} Apr 22 17:58:24.980373 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.980349 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" event={"ID":"a8057716-e97e-4720-8b6d-c8da14dbf284","Type":"ContainerStarted","Data":"7ac416d355d6405d4a6ed40037ae0d5a318384b6aa10a11272d0c79eff8354fd"} Apr 22 17:58:24.980373 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.980376 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" event={"ID":"a8057716-e97e-4720-8b6d-c8da14dbf284","Type":"ContainerStarted","Data":"1a083ac269ec36e3509f015d366108c65a840585d15760dece086d89f90523c9"} Apr 22 17:58:24.980537 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:24.980385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" event={"ID":"a8057716-e97e-4720-8b6d-c8da14dbf284","Type":"ContainerStarted","Data":"bcec990200c0f0e17efe3bdeb74b798b80d91d052e8f340072e8ce0b4fd31052"} Apr 22 17:58:25.006512 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:25.006457 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.006440222 podStartE2EDuration="2.006440222s" podCreationTimestamp="2026-04-22 17:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 17:58:25.005186831 +0000 UTC m=+286.483710256" watchObservedRunningTime="2026-04-22 17:58:25.006440222 +0000 UTC m=+286.484963677" Apr 22 17:58:25.026645 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:25.026600 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6554b8784f-ll4hp" podStartSLOduration=2.215747305 podStartE2EDuration="4.026586861s" podCreationTimestamp="2026-04-22 17:58:21 +0000 UTC" firstStartedPulling="2026-04-22 17:58:22.342686587 +0000 UTC m=+283.821209988" lastFinishedPulling="2026-04-22 17:58:24.153526144 +0000 UTC m=+285.632049544" observedRunningTime="2026-04-22 17:58:25.024830145 +0000 UTC m=+286.503353570" watchObservedRunningTime="2026-04-22 17:58:25.026586861 +0000 UTC m=+286.505110283" Apr 22 17:58:28.363198 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:28.363163 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:32.971195 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:32.971163 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sgmcw" Apr 22 17:58:38.990837 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:38.990815 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:58:38.991221 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:38.990815 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 17:58:38.993390 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:58:38.993373 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:59:23.362791 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:59:23.362700 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:59:23.378508 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:59:23.378477 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:59:24.170786 ip-10-0-143-11 kubenswrapper[2564]: I0422 17:59:24.170749 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:03:39.016940 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:39.016909 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:03:39.017781 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:39.017721 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:03:53.566887 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.566786 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-kl7mz"] Apr 22 18:03:53.570152 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.570134 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kl7mz" Apr 22 18:03:53.572968 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.572935 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:03:53.573957 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.573924 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:03:53.574091 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.573987 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-w99hj\"" Apr 22 18:03:53.574091 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.574015 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:03:53.574721 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.574698 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kl7mz"] Apr 22 18:03:53.666425 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.666381 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j4m2\" (UniqueName: \"kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2\") pod \"s3-init-kl7mz\" (UID: \"952446a9-3fb7-4b71-a583-1cc37945cedc\") " pod="kserve/s3-init-kl7mz" Apr 22 18:03:53.767463 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.767427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j4m2\" (UniqueName: \"kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2\") pod \"s3-init-kl7mz\" (UID: \"952446a9-3fb7-4b71-a583-1cc37945cedc\") " pod="kserve/s3-init-kl7mz" Apr 22 18:03:53.776142 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.776107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j4m2\" (UniqueName: 
\"kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2\") pod \"s3-init-kl7mz\" (UID: \"952446a9-3fb7-4b71-a583-1cc37945cedc\") " pod="kserve/s3-init-kl7mz" Apr 22 18:03:53.889278 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:53.889246 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kl7mz" Apr 22 18:03:54.012463 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:54.012382 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kl7mz"] Apr 22 18:03:54.015523 ip-10-0-143-11 kubenswrapper[2564]: W0422 18:03:54.015497 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod952446a9_3fb7_4b71_a583_1cc37945cedc.slice/crio-0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e WatchSource:0}: Error finding container 0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e: Status 404 returned error can't find the container with id 0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e Apr 22 18:03:54.017645 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:54.017625 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:03:54.962846 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:54.962805 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kl7mz" event={"ID":"952446a9-3fb7-4b71-a583-1cc37945cedc","Type":"ContainerStarted","Data":"0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e"} Apr 22 18:03:58.978363 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:58.978332 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kl7mz" event={"ID":"952446a9-3fb7-4b71-a583-1cc37945cedc","Type":"ContainerStarted","Data":"0ba3d33441e8af26611bc9cd0950e24d07f7b00126183425a3ca3ef94848a0ec"} Apr 22 18:03:58.994351 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:03:58.994305 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-kl7mz" podStartSLOduration=1.5044056970000002 podStartE2EDuration="5.99428991s" podCreationTimestamp="2026-04-22 18:03:53 +0000 UTC" firstStartedPulling="2026-04-22 18:03:54.01778628 +0000 UTC m=+615.496309682" lastFinishedPulling="2026-04-22 18:03:58.507670494 +0000 UTC m=+619.986193895" observedRunningTime="2026-04-22 18:03:58.993447781 +0000 UTC m=+620.471971205" watchObservedRunningTime="2026-04-22 18:03:58.99428991 +0000 UTC m=+620.472813332" Apr 22 18:04:01.989211 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:01.989175 2564 generic.go:358] "Generic (PLEG): container finished" podID="952446a9-3fb7-4b71-a583-1cc37945cedc" containerID="0ba3d33441e8af26611bc9cd0950e24d07f7b00126183425a3ca3ef94848a0ec" exitCode=0 Apr 22 18:04:01.989567 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:01.989247 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kl7mz" event={"ID":"952446a9-3fb7-4b71-a583-1cc37945cedc","Type":"ContainerDied","Data":"0ba3d33441e8af26611bc9cd0950e24d07f7b00126183425a3ca3ef94848a0ec"} Apr 22 18:04:03.122430 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.122406 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kl7mz" Apr 22 18:04:03.256086 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.256003 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j4m2\" (UniqueName: \"kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2\") pod \"952446a9-3fb7-4b71-a583-1cc37945cedc\" (UID: \"952446a9-3fb7-4b71-a583-1cc37945cedc\") " Apr 22 18:04:03.258094 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.258062 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2" (OuterVolumeSpecName: "kube-api-access-2j4m2") pod "952446a9-3fb7-4b71-a583-1cc37945cedc" (UID: "952446a9-3fb7-4b71-a583-1cc37945cedc"). InnerVolumeSpecName "kube-api-access-2j4m2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:03.357440 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.357391 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2j4m2\" (UniqueName: \"kubernetes.io/projected/952446a9-3fb7-4b71-a583-1cc37945cedc-kube-api-access-2j4m2\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 18:04:03.995642 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.995606 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kl7mz" event={"ID":"952446a9-3fb7-4b71-a583-1cc37945cedc","Type":"ContainerDied","Data":"0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e"} Apr 22 18:04:03.995642 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.995635 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kl7mz" Apr 22 18:04:03.995876 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:04:03.995642 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e3b8ec117d861efa651a3d8ef7e50d1fb8ae2a081d93bd997c018992847ee6e" Apr 22 18:08:39.039526 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:08:39.039445 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:08:39.042195 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:08:39.042176 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:13:39.070580 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:13:39.070510 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:13:39.073334 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:13:39.073316 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:18:39.095451 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:18:39.095423 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:18:39.097787 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:18:39.097380 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:23:39.119234 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:23:39.119124 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:23:39.123166 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:23:39.122014 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:28:39.142231 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:28:39.142127 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:28:39.145008 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:28:39.144991 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:33:39.166302 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:33:39.166212 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:33:39.170214 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:33:39.170193 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:38:39.190068 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:38:39.189932 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:38:39.193929 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:38:39.193912 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:43:39.213634 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:43:39.213529 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:43:39.218210 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:43:39.218190 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:48:39.238332 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:48:39.238203 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:48:39.244326 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:48:39.244302 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:53:39.261331 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:53:39.261210 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:53:39.269063 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:53:39.269044 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:58:39.294817 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:58:39.294703 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 18:58:39.300248 ip-10-0-143-11 kubenswrapper[2564]: I0422 18:58:39.300227 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 19:03:39.318456 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:03:39.318345 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 19:03:39.324842 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:03:39.324820 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log" Apr 22 19:04:17.029409 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.029366 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vh2hj/must-gather-dddfl"] Apr 22 19:04:17.030208 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.030182 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="952446a9-3fb7-4b71-a583-1cc37945cedc" containerName="s3-init" Apr 22 19:04:17.030320 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.030211 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="952446a9-3fb7-4b71-a583-1cc37945cedc" containerName="s3-init" Apr 22 19:04:17.030392 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.030377 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="952446a9-3fb7-4b71-a583-1cc37945cedc" containerName="s3-init" Apr 22 19:04:17.034464 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.034438 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.037259 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.037224 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vh2hj\"/\"openshift-service-ca.crt\"" Apr 22 19:04:17.037409 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.037243 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vh2hj\"/\"kube-root-ca.crt\"" Apr 22 19:04:17.057451 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.057422 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vh2hj/must-gather-dddfl"] Apr 22 19:04:17.128848 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.128817 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfnw\" (UniqueName: \"kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.128848 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.128856 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.229732 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.229705 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfnw\" (UniqueName: \"kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 
19:04:17.229732 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.229735 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.230095 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.230077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.238093 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.238072 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfnw\" (UniqueName: \"kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw\") pod \"must-gather-dddfl\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.354846 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.354815 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:17.475939 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.475808 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vh2hj/must-gather-dddfl"] Apr 22 19:04:17.478226 ip-10-0-143-11 kubenswrapper[2564]: W0422 19:04:17.478192 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b30f5a_c776_4a5a_baec_cde78e2d8deb.slice/crio-9a2f1b96a54ca5ab47d642e47cb194e8c1d3fe8d9d08bfc73e80e6716a610fd9 WatchSource:0}: Error finding container 9a2f1b96a54ca5ab47d642e47cb194e8c1d3fe8d9d08bfc73e80e6716a610fd9: Status 404 returned error can't find the container with id 9a2f1b96a54ca5ab47d642e47cb194e8c1d3fe8d9d08bfc73e80e6716a610fd9 Apr 22 19:04:17.479953 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.479936 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:04:17.840007 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:17.839975 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vh2hj/must-gather-dddfl" event={"ID":"91b30f5a-c776-4a5a-baec-cde78e2d8deb","Type":"ContainerStarted","Data":"9a2f1b96a54ca5ab47d642e47cb194e8c1d3fe8d9d08bfc73e80e6716a610fd9"} Apr 22 19:04:23.867184 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:23.867142 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vh2hj/must-gather-dddfl" event={"ID":"91b30f5a-c776-4a5a-baec-cde78e2d8deb","Type":"ContainerStarted","Data":"da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae"} Apr 22 19:04:23.867577 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:23.867193 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vh2hj/must-gather-dddfl" 
event={"ID":"91b30f5a-c776-4a5a-baec-cde78e2d8deb","Type":"ContainerStarted","Data":"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959"} Apr 22 19:04:23.885362 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:23.885313 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vh2hj/must-gather-dddfl" podStartSLOduration=0.983526906 podStartE2EDuration="6.885299953s" podCreationTimestamp="2026-04-22 19:04:17 +0000 UTC" firstStartedPulling="2026-04-22 19:04:17.480101021 +0000 UTC m=+4238.958624425" lastFinishedPulling="2026-04-22 19:04:23.381874058 +0000 UTC m=+4244.860397472" observedRunningTime="2026-04-22 19:04:23.88373383 +0000 UTC m=+4245.362257254" watchObservedRunningTime="2026-04-22 19:04:23.885299953 +0000 UTC m=+4245.363823376" Apr 22 19:04:48.953268 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:48.953231 2564 generic.go:358] "Generic (PLEG): container finished" podID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerID="6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959" exitCode=0 Apr 22 19:04:48.953673 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:48.953294 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vh2hj/must-gather-dddfl" event={"ID":"91b30f5a-c776-4a5a-baec-cde78e2d8deb","Type":"ContainerDied","Data":"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959"} Apr 22 19:04:48.953673 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:48.953634 2564 scope.go:117] "RemoveContainer" containerID="6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959" Apr 22 19:04:49.271583 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:49.271497 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vh2hj_must-gather-dddfl_91b30f5a-c776-4a5a-baec-cde78e2d8deb/gather/0.log" Apr 22 19:04:52.480538 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:52.480509 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-5lc5s_6756cbc5-361a-4f47-b8c7-7fa5c6ce2bc7/global-pull-secret-syncer/0.log" Apr 22 19:04:52.653777 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:52.653737 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fgqcv_8a12c58d-7667-4b16-8b5b-9fb5d4f10530/konnectivity-agent/0.log" Apr 22 19:04:52.786213 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:52.786126 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-11.ec2.internal_fdf6ed0045bf81395f30896fa82f74ae/haproxy/0.log" Apr 22 19:04:54.666121 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.666082 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vh2hj/must-gather-dddfl"] Apr 22 19:04:54.666720 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.666517 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vh2hj/must-gather-dddfl" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="copy" containerID="cri-o://da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae" gracePeriod=2 Apr 22 19:04:54.668293 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.668270 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vh2hj/must-gather-dddfl"] Apr 22 19:04:54.668516 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.668494 2564 status_manager.go:895] "Failed to get status for pod" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" pod="openshift-must-gather-vh2hj/must-gather-dddfl" err="pods \"must-gather-dddfl\" is forbidden: User \"system:node:ip-10-0-143-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vh2hj\": no relationship found between node 'ip-10-0-143-11.ec2.internal' and this object" Apr 22 19:04:54.902660 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.902638 2564 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-vh2hj_must-gather-dddfl_91b30f5a-c776-4a5a-baec-cde78e2d8deb/copy/0.log" Apr 22 19:04:54.902976 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.902962 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:54.905437 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.905413 2564 status_manager.go:895] "Failed to get status for pod" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" pod="openshift-must-gather-vh2hj/must-gather-dddfl" err="pods \"must-gather-dddfl\" is forbidden: User \"system:node:ip-10-0-143-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vh2hj\": no relationship found between node 'ip-10-0-143-11.ec2.internal' and this object" Apr 22 19:04:54.971652 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.971592 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vh2hj_must-gather-dddfl_91b30f5a-c776-4a5a-baec-cde78e2d8deb/copy/0.log" Apr 22 19:04:54.971936 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.971916 2564 generic.go:358] "Generic (PLEG): container finished" podID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerID="da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae" exitCode=143 Apr 22 19:04:54.972009 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.971972 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vh2hj/must-gather-dddfl" Apr 22 19:04:54.972009 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.971986 2564 scope.go:117] "RemoveContainer" containerID="da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae" Apr 22 19:04:54.974044 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.974012 2564 status_manager.go:895] "Failed to get status for pod" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" pod="openshift-must-gather-vh2hj/must-gather-dddfl" err="pods \"must-gather-dddfl\" is forbidden: User \"system:node:ip-10-0-143-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vh2hj\": no relationship found between node 'ip-10-0-143-11.ec2.internal' and this object" Apr 22 19:04:54.977268 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.977243 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output\") pod \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " Apr 22 19:04:54.977365 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.977330 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hfnw\" (UniqueName: \"kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw\") pod \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\" (UID: \"91b30f5a-c776-4a5a-baec-cde78e2d8deb\") " Apr 22 19:04:54.978597 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.978566 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "91b30f5a-c776-4a5a-baec-cde78e2d8deb" (UID: "91b30f5a-c776-4a5a-baec-cde78e2d8deb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:54.979330 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.979311 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw" (OuterVolumeSpecName: "kube-api-access-4hfnw") pod "91b30f5a-c776-4a5a-baec-cde78e2d8deb" (UID: "91b30f5a-c776-4a5a-baec-cde78e2d8deb"). InnerVolumeSpecName "kube-api-access-4hfnw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:04:54.979737 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.979723 2564 scope.go:117] "RemoveContainer" containerID="6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959" Apr 22 19:04:54.993795 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.993775 2564 scope.go:117] "RemoveContainer" containerID="da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae" Apr 22 19:04:54.994165 ip-10-0-143-11 kubenswrapper[2564]: E0422 19:04:54.994137 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae\": container with ID starting with da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae not found: ID does not exist" containerID="da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae" Apr 22 19:04:54.994245 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.994173 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae"} err="failed to get container status \"da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae\": rpc error: code = NotFound desc = could not find container \"da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae\": container with ID starting with da37c49c4c3e46479092917a4bf3dc23c39adacaa5403664cfd246a64b9840ae not 
found: ID does not exist" Apr 22 19:04:54.994245 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.994193 2564 scope.go:117] "RemoveContainer" containerID="6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959" Apr 22 19:04:54.994469 ip-10-0-143-11 kubenswrapper[2564]: E0422 19:04:54.994450 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959\": container with ID starting with 6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959 not found: ID does not exist" containerID="6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959" Apr 22 19:04:54.994524 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:54.994475 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959"} err="failed to get container status \"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959\": rpc error: code = NotFound desc = could not find container \"6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959\": container with ID starting with 6abb5431a7483c18a366363b8ef5a7dcc09761b20cc45f130c03e7edc9f98959 not found: ID does not exist" Apr 22 19:04:55.057547 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.057517 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" path="/var/lib/kubelet/pods/91b30f5a-c776-4a5a-baec-cde78e2d8deb/volumes" Apr 22 19:04:55.078359 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.078335 2564 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b30f5a-c776-4a5a-baec-cde78e2d8deb-must-gather-output\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 19:04:55.078359 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.078357 2564 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hfnw\" (UniqueName: \"kubernetes.io/projected/91b30f5a-c776-4a5a-baec-cde78e2d8deb-kube-api-access-4hfnw\") on node \"ip-10-0-143-11.ec2.internal\" DevicePath \"\"" Apr 22 19:04:55.907784 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.907754 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/alertmanager/0.log" Apr 22 19:04:55.934615 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.934594 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/config-reloader/0.log" Apr 22 19:04:55.958461 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.958435 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/kube-rbac-proxy-web/0.log" Apr 22 19:04:55.979623 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:55.979580 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/kube-rbac-proxy/0.log" Apr 22 19:04:56.005284 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.005258 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/kube-rbac-proxy-metric/0.log" Apr 22 19:04:56.031642 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.031623 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/prom-label-proxy/0.log" Apr 22 19:04:56.061242 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.061221 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab29b6f6-6448-4c20-9469-9d2a350d7547/init-config-reloader/0.log" Apr 22 19:04:56.103317 
ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.103282 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-h5pjc_bb31bacb-bc3d-4ab6-93ea-80941f907553/cluster-monitoring-operator/0.log" Apr 22 19:04:56.123906 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.123829 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvcw9_b7bb8ac7-2202-43de-96a7-145ad6c915e7/kube-state-metrics/0.log" Apr 22 19:04:56.145764 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.145749 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvcw9_b7bb8ac7-2202-43de-96a7-145ad6c915e7/kube-rbac-proxy-main/0.log" Apr 22 19:04:56.166201 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.166182 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvcw9_b7bb8ac7-2202-43de-96a7-145ad6c915e7/kube-rbac-proxy-self/0.log" Apr 22 19:04:56.251644 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.251619 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2znk9_6df46903-fedf-4d2c-b525-4b5f8730c544/node-exporter/0.log" Apr 22 19:04:56.273563 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.273538 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2znk9_6df46903-fedf-4d2c-b525-4b5f8730c544/kube-rbac-proxy/0.log" Apr 22 19:04:56.294361 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.294342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2znk9_6df46903-fedf-4d2c-b525-4b5f8730c544/init-textfile/0.log" Apr 22 19:04:56.580412 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.580384 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/prometheus/0.log" Apr 22 19:04:56.597599 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.597556 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/config-reloader/0.log" Apr 22 19:04:56.621194 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.621168 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/thanos-sidecar/0.log" Apr 22 19:04:56.641662 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.641642 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/kube-rbac-proxy-web/0.log" Apr 22 19:04:56.664617 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.664599 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/kube-rbac-proxy/0.log" Apr 22 19:04:56.687657 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.687628 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/kube-rbac-proxy-thanos/0.log" Apr 22 19:04:56.714828 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.714813 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_22d4e49c-c889-4f0f-abbe-9dcb85ef4296/init-config-reloader/0.log" Apr 22 19:04:56.822643 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.822579 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6554b8784f-ll4hp_a8057716-e97e-4720-8b6d-c8da14dbf284/telemeter-client/0.log" Apr 22 19:04:56.843460 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.843442 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-6554b8784f-ll4hp_a8057716-e97e-4720-8b6d-c8da14dbf284/reload/0.log" Apr 22 19:04:56.870642 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.870623 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6554b8784f-ll4hp_a8057716-e97e-4720-8b6d-c8da14dbf284/kube-rbac-proxy/0.log" Apr 22 19:04:56.902998 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.902981 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/thanos-query/0.log" Apr 22 19:04:56.923295 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.923267 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/kube-rbac-proxy-web/0.log" Apr 22 19:04:56.951959 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.951943 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/kube-rbac-proxy/0.log" Apr 22 19:04:56.973941 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.973920 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/prom-label-proxy/0.log" Apr 22 19:04:56.994243 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:56.994224 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/kube-rbac-proxy-rules/0.log" Apr 22 19:04:57.020210 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:57.020188 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-775755d68c-n8jrt_5916f292-3d76-4f55-81bb-25b2585feda7/kube-rbac-proxy-metrics/0.log" Apr 22 19:04:58.212375 ip-10-0-143-11 
kubenswrapper[2564]: I0422 19:04:58.212343 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mhbld_da61e571-00a2-4ad8-86ad-1156286a7409/networking-console-plugin/0.log" Apr 22 19:04:59.727638 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.727603 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"] Apr 22 19:04:59.728033 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.727968 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="gather" Apr 22 19:04:59.728033 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.727980 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="gather" Apr 22 19:04:59.728033 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.728000 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="copy" Apr 22 19:04:59.728033 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.728006 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="copy" Apr 22 19:04:59.728160 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.728060 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="copy" Apr 22 19:04:59.728160 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.728070 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="91b30f5a-c776-4a5a-baec-cde78e2d8deb" containerName="gather" Apr 22 19:04:59.733226 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.733202 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.736782 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.736761 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-862r5\"/\"default-dockercfg-dzdgq\"" Apr 22 19:04:59.736906 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.736779 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"kube-root-ca.crt\"" Apr 22 19:04:59.736906 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.736853 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"openshift-service-ca.crt\"" Apr 22 19:04:59.740254 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.740145 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"] Apr 22 19:04:59.817746 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.817718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-lib-modules\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.817927 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.817767 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-sys\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.817927 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.817876 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-proc\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.817927 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.817914 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-podres\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.818058 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.817994 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-kube-api-access-wh97n\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918434 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918405 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-lib-modules\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-sys\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " 
pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918482 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-proc\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918503 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-podres\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918541 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-kube-api-access-wh97n\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-sys\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.918641 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-proc\") pod 
\"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.919071 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918592 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-lib-modules\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.919144 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.918637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-podres\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:04:59.932663 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:04:59.932627 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/0f0dbb01-4a50-41fe-8db8-20c8f97885f4-kube-api-access-wh97n\") pod \"perf-node-gather-daemonset-62xrw\" (UID: \"0f0dbb01-4a50-41fe-8db8-20c8f97885f4\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" Apr 22 19:05:00.043136 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.043055 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"
Apr 22 19:05:00.180543 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.180517 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"]
Apr 22 19:05:00.183047 ip-10-0-143-11 kubenswrapper[2564]: W0422 19:05:00.183021 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0f0dbb01_4a50_41fe_8db8_20c8f97885f4.slice/crio-e2cc936244a50a50b6822db113d470731fcddd8914fe29951bc341fb837d950d WatchSource:0}: Error finding container e2cc936244a50a50b6822db113d470731fcddd8914fe29951bc341fb837d950d: Status 404 returned error can't find the container with id e2cc936244a50a50b6822db113d470731fcddd8914fe29951bc341fb837d950d
Apr 22 19:05:00.420151 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.420119 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sgmcw_4b38c080-af4d-4d73-ad1f-c364849e7212/dns/0.log"
Apr 22 19:05:00.445513 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.445484 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sgmcw_4b38c080-af4d-4d73-ad1f-c364849e7212/kube-rbac-proxy/0.log"
Apr 22 19:05:00.502193 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.502167 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gglf5_494d8548-161a-40c6-aa4f-a43f0cb0ff07/dns-node-resolver/0.log"
Apr 22 19:05:00.990169 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.990126 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" event={"ID":"0f0dbb01-4a50-41fe-8db8-20c8f97885f4","Type":"ContainerStarted","Data":"53867c493f1800c1ab3032759a8d04dc0efd27dd2ee257d6d7947475191ae7dd"}
Apr 22 19:05:00.990169 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.990171 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" event={"ID":"0f0dbb01-4a50-41fe-8db8-20c8f97885f4","Type":"ContainerStarted","Data":"e2cc936244a50a50b6822db113d470731fcddd8914fe29951bc341fb837d950d"}
Apr 22 19:05:00.990611 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:00.990308 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"
Apr 22 19:05:01.016896 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:01.016826 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw" podStartSLOduration=2.016810998 podStartE2EDuration="2.016810998s" podCreationTimestamp="2026-04-22 19:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:01.014464996 +0000 UTC m=+4282.492988419" watchObservedRunningTime="2026-04-22 19:05:01.016810998 +0000 UTC m=+4282.495334440"
Apr 22 19:05:01.056997 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:01.056968 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d9mrk_58e27bc8-4e3e-4655-9e83-8aed674d5e93/node-ca/0.log"
Apr 22 19:05:01.904619 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:01.904589 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-768bdd6654-j6n7q_92d95712-bf5b-4abb-8d3c-309ebdebbdd5/router/0.log"
Apr 22 19:05:02.220474 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.220404 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bgpms_341b2cf5-e5b2-4950-98bb-f85daf6a0a5f/serve-healthcheck-canary/0.log"
Apr 22 19:05:02.607364 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.607339 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-g6jpq_fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35/insights-operator/0.log"
Apr 22 19:05:02.608949 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.608932 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-g6jpq_fa95ea66-4b22-4d17-b4f8-4c7e83ef4d35/insights-operator/1.log"
Apr 22 19:05:02.779753 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.779727 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9fq46_c96db02a-45a4-45f2-8706-3e634cf21f29/kube-rbac-proxy/0.log"
Apr 22 19:05:02.811639 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.811611 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9fq46_c96db02a-45a4-45f2-8706-3e634cf21f29/exporter/0.log"
Apr 22 19:05:02.831729 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:02.831702 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9fq46_c96db02a-45a4-45f2-8706-3e634cf21f29/extractor/0.log"
Apr 22 19:05:05.295605 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:05.295577 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-kl7mz_952446a9-3fb7-4b71-a583-1cc37945cedc/s3-init/0.log"
Apr 22 19:05:07.003245 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:07.003219 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-62xrw"
Apr 22 19:05:09.202727 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:09.202644 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-g44hn_5080e862-a8c9-455b-8f63-a505b89977b4/migrator/0.log"
Apr 22 19:05:09.226012 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:09.225983 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-g44hn_5080e862-a8c9-455b-8f63-a505b89977b4/graceful-termination/0.log"
Apr 22 19:05:09.640398 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:09.640362 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jpj98_ec52f02a-c723-4c97-a74b-1348c7d84b33/kube-storage-version-migrator-operator/1.log"
Apr 22 19:05:09.641377 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:09.641353 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jpj98_ec52f02a-c723-4c97-a74b-1348c7d84b33/kube-storage-version-migrator-operator/0.log"
Apr 22 19:05:10.889262 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.889233 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:05:10.914303 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.914273 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/egress-router-binary-copy/0.log"
Apr 22 19:05:10.937542 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.937520 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/cni-plugins/0.log"
Apr 22 19:05:10.958695 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.958628 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/bond-cni-plugin/0.log"
Apr 22 19:05:10.979648 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.979626 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/routeoverride-cni/0.log"
Apr 22 19:05:10.999475 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:10.999449 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/whereabouts-cni-bincopy/0.log"
Apr 22 19:05:11.021363 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:11.021341 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdt5n_b86c7b56-b95e-4b34-8a02-a7cbb80decae/whereabouts-cni/0.log"
Apr 22 19:05:11.049221 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:11.049197 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ctdsd_d41f25be-a0c4-4095-99e5-f6190accf5a8/kube-multus/0.log"
Apr 22 19:05:11.222456 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:11.222378 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wcgxk_2c1e2467-3796-47ad-928c-f82f435261e9/network-metrics-daemon/0.log"
Apr 22 19:05:11.243731 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:11.243693 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wcgxk_2c1e2467-3796-47ad-928c-f82f435261e9/kube-rbac-proxy/0.log"
Apr 22 19:05:12.566197 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.566171 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-controller/0.log"
Apr 22 19:05:12.588010 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.587978 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/0.log"
Apr 22 19:05:12.605357 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.605330 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovn-acl-logging/1.log"
Apr 22 19:05:12.625418 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.625396 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/kube-rbac-proxy-node/0.log"
Apr 22 19:05:12.650675 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.650653 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:05:12.672599 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.672577 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/northd/0.log"
Apr 22 19:05:12.698433 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.698414 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/nbdb/0.log"
Apr 22 19:05:12.723174 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.723155 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/sbdb/0.log"
Apr 22 19:05:12.814814 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:12.814781 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8zxzp_75123953-ef56-489a-8b07-e5d0a129fad3/ovnkube-controller/0.log"
Apr 22 19:05:13.988172 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:13.988141 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rx62v_6d8e2e5c-104e-4adf-8584-d1f41c9d3b9c/network-check-target-container/0.log"
Apr 22 19:05:14.905406 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:14.905367 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dwg7z_dc9de976-617d-407c-9074-f0ad44c2518d/iptables-alerter/0.log"
Apr 22 19:05:15.610569 ip-10-0-143-11 kubenswrapper[2564]: I0422 19:05:15.610542 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tr9fb_bcc515e3-814e-41a4-9bbe-dc0050efd02c/tuned/0.log"