Apr 24 21:24:06.835356 ip-10-0-137-28 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:06.835368 ip-10-0-137-28 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:06.835375 ip-10-0-137-28 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:06.835636 ip-10-0-137-28 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:24:16.985599 ip-10-0-137-28 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:24:16.985616 ip-10-0-137-28 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7e8973327062487eb68810196ed61686 --
Apr 24 21:26:27.090138 ip-10-0-137-28 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:27.539299 ip-10-0-137-28 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:27.539299 ip-10-0-137-28 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:27.539299 ip-10-0-137-28 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:27.539299 ip-10-0-137-28 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:27.539299 ip-10-0-137-28 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:27.541349 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.541135 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:27.547757 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547734 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:27.547757 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547752 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:27.547757 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547756 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:27.547757 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547760 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:27.547757 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547763 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547766 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547769 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547774 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547777 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547779 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547784 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547788 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547791 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547794 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547797 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547800 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547803 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547806 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547809 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547811 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547814 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547816 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547819 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:27.547971 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547822 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547839 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547842 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547844 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547847 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547852 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547859 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547862 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547864 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547867 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547870 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547872 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547875 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547878 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547880 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547884 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547888 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547891 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547894 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:27.548448 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547897 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547899 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547902 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547904 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547907 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547910 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547912 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547915 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547917 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547919 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547922 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547925 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547927 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547930 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547933 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547935 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547938 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547941 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547944 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547946 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:27.548943 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547949 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547952 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547954 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547957 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547959 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547962 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547965 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547968 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547972 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547974 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547977 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547980 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547983 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547985 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547988 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.547991 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548000 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548003 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548005 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548008 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:27.549428 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548010 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548015 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548017 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548020 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548454 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548459 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548462 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548465 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548467 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548470 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548473 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548475 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548478 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548480 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548483 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548485 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548488 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548490 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548493 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548496 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:27.549935 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548499 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548502 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548505 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548507 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548510 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548513 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548515 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548518 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548526 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548529 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548532 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548534 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548536 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548539 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548542 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548546 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548549 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548551 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548554 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548557 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:27.550416 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548559 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548562 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548565 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548567 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548570 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548572 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548574 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548577 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548579 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548582 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548585 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548587 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548590 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548595 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548598 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548601 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548604 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548607 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548609 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:27.550959 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548612 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548614 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548623 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548626 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548628 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548631 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548633 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548636 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548638 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548641 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548643 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548646 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548648 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548651 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548653 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548656 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548658 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548660 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548663 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548666 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:27.551431 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548669 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548672 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548674 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548677 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548680 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548682 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548685 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548688 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548690 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548693 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.548696 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550269 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550279 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550289 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550293 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550304 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550307 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550312 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550316 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550319 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550322 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:27.551934 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550326 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550329 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550332 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550335 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550338 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550341 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550344 2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550346 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550349 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550356 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550359 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550362 2574 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550365 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550368 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550373 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550377 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550380 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550384 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550387 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550390 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550393 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550396 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550399 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550403 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550406 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:27.552441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550409 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550412 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550420 2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550424 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550431 2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550434 2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550437 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550440 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550443 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550447 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550450 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550453 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550456 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550459 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550462 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550465 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550467 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550470 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550473 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550476 2574 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550480 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550483 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550486 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550489 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550492 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 24
21:26:27.553062 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550495 2574 flags.go:64] FLAG: --help="false" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550498 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550501 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550504 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550507 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550510 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550513 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550516 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550519 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550521 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550530 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550533 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550536 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550539 2574 
flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550542 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550545 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550548 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550551 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550554 2574 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550556 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550559 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550562 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550568 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:27.553674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550570 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550573 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550576 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550579 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550582 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:27.554240 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:26:27.550585 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550588 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550592 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550595 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550599 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550602 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550605 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550608 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550611 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550614 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550616 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550619 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550627 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550630 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550633 2574 flags.go:64] 
FLAG: --oom-score-adj="-999" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550642 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550645 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550650 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550653 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:26:27.554240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550656 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550659 2574 flags.go:64] FLAG: --port="10250" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550662 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550665 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d614fbbc0c295be0" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550668 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550671 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550674 2574 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550677 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550680 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550684 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:27.554816 
ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550686 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550689 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550692 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550695 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550699 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550701 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550704 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550707 2574 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550710 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550713 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550716 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550719 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550721 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550724 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550727 2574 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550731 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:27.554816 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550734 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550737 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550739 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550744 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550747 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550750 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550753 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550759 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550761 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550764 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550769 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550772 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550775 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: 
I0424 21:26:27.550778 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550781 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550783 2574 flags.go:64] FLAG: --v="2" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550788 2574 flags.go:64] FLAG: --version="false" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550792 2574 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550797 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.550800 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550920 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550924 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550927 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550930 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:27.555459 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550932 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550935 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550937 2574 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550940 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550942 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550945 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550951 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550954 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550956 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550960 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550964 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550968 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550972 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550975 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550978 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550980 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550983 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550986 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550988 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:27.556090 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550991 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550993 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550996 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.550998 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551001 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551003 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551006 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551008 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551011 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551015 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551017 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551020 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551022 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551025 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551027 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551030 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551033 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551035 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551038 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551041 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:27.556645 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551044 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551047 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551050 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551052 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551055 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551057 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551060 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551062 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551065 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551067 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551070 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551072 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551075 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551077 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551080 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551082 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551085 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551087 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551090 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551093 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:27.557175 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551097 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551100 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551103 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551106 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551108 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551111 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551113 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551116 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551118 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551121 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551123 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551127 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551130 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551132 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551135 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551138 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551140 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551143 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551146 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551149 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:27.557710 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551151 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:27.558241 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551154 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:27.558241 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.551157 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:27.558241 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.552095 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:27.560084 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.560062 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:27.560126 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.560085 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:27.560165 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560156 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:27.560165 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560164 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560167 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560171 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560174 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560177 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560179 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560182 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560186 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560188 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560191 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560194 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560197 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560199 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560202 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560205 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560207 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560210 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560212 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560216 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:27.560221 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560219 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560223 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560226 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560229 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560232 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560235 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560238 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560241 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560243 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560246 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560249 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560253 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560255 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560258 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560261 2574
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560263 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560266 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560268 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560271 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560273 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560276 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:27.560690 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560278 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560281 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560283 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560286 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560289 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560291 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 
21:26:27.560294 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560296 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560299 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560301 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560304 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560307 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560309 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560313 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560315 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560318 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560322 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560326 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560329 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:27.561223 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560332 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560335 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560338 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560341 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560347 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560350 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560353 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560357 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560359 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560361 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560364 2574 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560366 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560369 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560371 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560374 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560377 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560379 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560382 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560384 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560387 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:27.561733 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560389 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560392 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560396 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560400 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560402 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560405 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.560410 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560549 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560554 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560557 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560559 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560562 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560565 2574 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560568 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560570 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:27.562320 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560573 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560575 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560579 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560581 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560584 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560587 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560589 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560592 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560594 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560597 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 
21:26:27.560599 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560602 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560604 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560607 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560610 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560612 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560615 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560617 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560620 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560623 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:27.562697 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560625 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560628 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560630 2574 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560633 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560637 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560639 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560642 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560644 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560647 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560649 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560652 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560654 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560657 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560659 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560662 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560668 2574 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560671 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560674 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560677 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560679 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:27.563201 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560682 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560684 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560687 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560689 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560692 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560695 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560698 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560700 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:26:27.563716 
ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560703 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560706 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560708 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560711 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560713 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560715 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560719 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560723 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560726 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560730 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560732 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:27.563716 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560735 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560738 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560740 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560743 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560745 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560748 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560750 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560753 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560755 2574 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560759 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560761 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560764 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560766 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560769 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560772 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560774 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560776 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560779 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:26:27.564203 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:27.560782 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:27.564642 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.560786 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:26:27.564642 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.561694 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:26:27.565056 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.565041 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:26:27.566154 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.566141 2574 server.go:1019] "Starting client certificate rotation" Apr 24 21:26:27.566260 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.566244 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:26:27.566321 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.566293 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:26:27.595380 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.595353 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:26:27.598469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.598450 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:26:27.611814 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.611793 2574 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:26:27.618270 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.618252 2574 log.go:25] "Validated CRI v1 image API" Apr 24 21:26:27.622137 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.622110 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:26:27.627739 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:26:27.627722 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:27.628105 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.628082 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8d4ae604-58e4-49df-9a94-40ca75b1e436:/dev/nvme0n1p4 fdf89813-3dc4-40ba-8cf2-609614cfb070:/dev/nvme0n1p3]
Apr 24 21:26:27.628153 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.628107 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:27.634249 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.634139 2574 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:27.631973506 +0000 UTC m=+0.439330105 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098428 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29f6db1cc28b0e5a70c4f01aa0a4e3 SystemUUID:ec29f6db-1cc2-8b0e-5a70-c4f01aa0a4e3 BootID:7e897332-7062-487e-b688-10196ed61686 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6e:9b:4d:d3:81 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6e:9b:4d:d3:81 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:e6:b4:f8:10:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:27.634249 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.634245 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:27.634368 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.634362 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:27.636286 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.636259 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:27.636432 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.636288 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-28.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:26:27.636474 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.636442 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:26:27.636474 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.636451 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:26:27.636474 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.636467 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:26:27.638175 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.638163 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:26:27.639392 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.639383 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:26:27.639525 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.639517 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:26:27.642174 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.642163 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:26:27.642213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.642181 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:26:27.642213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.642196 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:26:27.642213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.642205 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:26:27.642213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.642213 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:26:27.643368 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.643357 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:26:27.643406 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.643375 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:26:27.646857 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.646840 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:26:27.649221 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.649205 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:26:27.650762 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650748 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650769 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650776 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650790 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650796 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650802 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650808 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650813 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:26:27.650819 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650820 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:26:27.651038 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650839 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:26:27.651038 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650852 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:26:27.651038 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.650861 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:26:27.651565 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.651549 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fb5kx"
Apr 24 21:26:27.651777 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.651766 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:26:27.651811 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.651781 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:26:27.655712 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.655694 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-28.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:26:27.655808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.655752 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:26:27.655808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.655782 2574 server.go:1295] "Started kubelet"
Apr 24 21:26:27.655894 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.655863 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:26:27.656356 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.656308 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:26:27.656435 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.656375 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:26:27.656435 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.656412 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:26:27.656435 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.656412 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-28.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:26:27.656688 ip-10-0-137-28 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:26:27.660427 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.660403 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:26:27.661506 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.661483 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fb5kx"
Apr 24 21:26:27.661955 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.661937 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:26:27.666494 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.665331 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-28.ec2.internal.18a96819b8a6761a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-28.ec2.internal,UID:ip-10-0-137-28.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-28.ec2.internal,},FirstTimestamp:2026-04-24 21:26:27.655759386 +0000 UTC m=+0.463115982,LastTimestamp:2026-04-24 21:26:27.655759386 +0000 UTC m=+0.463115982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-28.ec2.internal,}"
Apr 24 21:26:27.669249 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.669228 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:26:27.669468 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.669441 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:27.670350 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.670334 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:26:27.671047 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671032 2574 factory.go:55] Registering systemd factory
Apr 24 21:26:27.671145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671057 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:26:27.671145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671100 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:26:27.671145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671100 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:26:27.671145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671126 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:26:27.671327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671216 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:26:27.671327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671229 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:26:27.671327 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.671311 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found"
Apr 24 21:26:27.671457 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671445 2574 factory.go:153] Registering CRI-O factory
Apr 24 21:26:27.671502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671461 2574 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:26:27.671550 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671510 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:26:27.671550 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671548 2574 factory.go:103] Registering Raw factory
Apr 24 21:26:27.671631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671564 2574 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:26:27.671992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.671972 2574 manager.go:319] Starting recovery of all containers
Apr 24 21:26:27.681002 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.680986 2574 manager.go:324] Recovery completed
Apr 24 21:26:27.685134 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.685120 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:26:27.687820 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.687805 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:26:27.687923 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.687845 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:26:27.687923 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.687856 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:26:27.688079 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.688059 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:27.688322 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.688307 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:26:27.688322 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.688319 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:26:27.688413 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.688335 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:26:27.691573 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.691554 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-28.ec2.internal\" not found" node="ip-10-0-137-28.ec2.internal"
Apr 24 21:26:27.692108 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.692097 2574 policy_none.go:49] "None policy: Start"
Apr 24 21:26:27.692151 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.692113 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:26:27.692151 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.692122 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:26:27.744424 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744410 2574 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.744470 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744480 2574 server.go:85] "Starting device plugin registration server"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744718 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744732 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744812 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744964 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.744975 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.745641 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:26:27.757428 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.745681 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-28.ec2.internal\" not found"
Apr 24 21:26:27.781778 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.781750 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:26:27.782915 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.782892 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:26:27.782990 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.782926 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:26:27.782990 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.782949 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:26:27.782990 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.782959 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:27.783101 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.783041 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:27.789164 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.789145 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:27.845641 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.845581 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:27.846355 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.846342 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:27.846419 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.846374 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:27.846419 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.846388 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:27.846419 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.846410 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.855215 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.855196 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.855215 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.855215 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-28.ec2.internal\": node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:27.869092 
ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.869075 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:27.883517 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.883499 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal"] Apr 24 21:26:27.883575 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.883560 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:27.884353 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.884340 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:27.884405 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.884368 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:27.884405 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.884382 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:27.886613 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.886601 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:27.887017 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.886742 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.887017 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.886775 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:27.887300 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887285 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:27.887382 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887311 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:27.887382 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887325 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:27.887382 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887359 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:27.887382 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887383 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:27.887554 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.887395 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:27.889557 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.889539 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.889640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.889570 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:27.890246 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.890229 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:27.890309 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.890262 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:27.890309 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.890278 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:27.917458 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.917435 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-28.ec2.internal\" not found" node="ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.920668 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.920652 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-28.ec2.internal\" not found" node="ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.969924 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:27.969900 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:27.973195 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.973182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562a113f040bbc989a373b07efb12bcb-config\") pod 
\"kube-apiserver-proxy-ip-10-0-137-28.ec2.internal\" (UID: \"562a113f040bbc989a373b07efb12bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.973248 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.973204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:27.973248 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:27.973224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.069979 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.069947 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.074358 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.074452 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.074452 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.074452 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562a113f040bbc989a373b07efb12bcb-config\") pod \"kube-apiserver-proxy-ip-10-0-137-28.ec2.internal\" (UID: \"562a113f040bbc989a373b07efb12bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.074586 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562a113f040bbc989a373b07efb12bcb-config\") pod \"kube-apiserver-proxy-ip-10-0-137-28.ec2.internal\" (UID: \"562a113f040bbc989a373b07efb12bcb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.074586 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.074515 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15783212e1fd5fb858f33c4536fa6518-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal\" (UID: \"15783212e1fd5fb858f33c4536fa6518\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.170407 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.170325 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.219680 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.219656 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.223156 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.223139 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" Apr 24 21:26:28.270649 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.270626 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.371088 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.371060 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.471565 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.471475 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.565914 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.565887 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:26:28.566560 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.566053 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:28.566560 
ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.566067 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:28.572102 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.572087 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.663698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.663633 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:27 +0000 UTC" deadline="2027-11-16 12:49:35.962535973 +0000 UTC" Apr 24 21:26:28.663698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.663692 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13695h23m7.298849144s" Apr 24 21:26:28.669883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.669864 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:28.672422 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.672404 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found" Apr 24 21:26:28.684944 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.684923 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:26:28.704072 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.704042 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nmbmt" Apr 24 21:26:28.712020 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:26:28.711999 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nmbmt"
Apr 24 21:26:28.772540 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.772469 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found"
Apr 24 21:26:28.814629 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:28.814447 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15783212e1fd5fb858f33c4536fa6518.slice/crio-b03f053cdf5c5c214088ed810edada1d6255dd3a3e29a75a92a7754181feb1f4 WatchSource:0}: Error finding container b03f053cdf5c5c214088ed810edada1d6255dd3a3e29a75a92a7754181feb1f4: Status 404 returned error can't find the container with id b03f053cdf5c5c214088ed810edada1d6255dd3a3e29a75a92a7754181feb1f4
Apr 24 21:26:28.815092 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:28.815071 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562a113f040bbc989a373b07efb12bcb.slice/crio-9d09546d37fd81fb45a8c2ecb73911386117ebaee080f4e26828002d5ab2f9c2 WatchSource:0}: Error finding container 9d09546d37fd81fb45a8c2ecb73911386117ebaee080f4e26828002d5ab2f9c2: Status 404 returned error can't find the container with id 9d09546d37fd81fb45a8c2ecb73911386117ebaee080f4e26828002d5ab2f9c2
Apr 24 21:26:28.820504 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.820486 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:26:28.873090 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.873059 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found"
Apr 24 21:26:28.958631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.958605 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:28.973986 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:28.973961 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-28.ec2.internal\" not found"
Apr 24 21:26:28.982721 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:28.982702 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:29.070881 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.070791 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal"
Apr 24 21:26:29.082536 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.082508 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:29.083488 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.083475 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal"
Apr 24 21:26:29.093390 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.093367 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:29.442164 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.442073 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:29.444370 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.444336 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:29.643369 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.643337 2574 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:29.651561 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.651534 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:29.652557 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.652529 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-msllw","openshift-multus/network-metrics-daemon-9csmp","openshift-network-operator/iptables-alerter-6h52t","kube-system/konnectivity-agent-4lns9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz","openshift-image-registry/node-ca-l5t5z","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal","openshift-multus/multus-additional-cni-plugins-glnmw","openshift-network-diagnostics/network-check-target-lqj24","openshift-ovn-kubernetes/ovnkube-node-g9dbf","kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hwpr2"]
Apr 24 21:26:29.657741 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.657714 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.660027 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.659998 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:29.660146 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.660072 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:29.660381 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.660356 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:29.660503 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.660412 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.660503 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.660487 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:29.660619 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.660597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-98czg\""
Apr 24 21:26:29.660711 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.660689 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.662280 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.662139 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.664395 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:29.664499 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664422 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.664499 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qjzxr\""
Apr 24 21:26:29.664594 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664449 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.664594 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664577 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4lns9"
Apr 24 21:26:29.664701 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.664688 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.667067 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.666773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zthhh\""
Apr 24 21:26:29.667067 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.666910 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:26:29.667067 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.667033 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.667271 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.667259 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:26:29.667347 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.667325 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rb7nx\""
Apr 24 21:26:29.667423 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.667356 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.667423 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.667261 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:26:29.669208 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.669189 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.669300 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.669226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:29.669300 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.669286 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:29.671675 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.671566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gntnh\""
Apr 24 21:26:29.671675 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.671566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:29.671675 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.671612 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:29.673987 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.673943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.674077 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.674051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.676335 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.676311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.677639 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.677613 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.677728 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.677671 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.678591 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.678566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:26:29.678591 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.678582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:26:29.678727 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.678597 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.678727 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.678701 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rr9p4\""
Apr 24 21:26:29.679000 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.678985 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lkgpw\""
Apr 24 21:26:29.679125 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679105 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679188 2574 reflector.go:430]
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679006 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679071 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2bfqq\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679008 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:26:29.679469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.679374 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:26:29.681086 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-conf-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681177 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-systemd-units\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681177 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-slash\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681177 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-netd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681319 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-host-slash\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.681319 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-k8s-cni-cncf-io\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681319 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-multus\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681319 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-hostroot\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681319 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681289 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bae04246-77d0-46d3-9aa4-2a74e4817f4f-agent-certs\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bae04246-77d0-46d3-9aa4-2a74e4817f4f-konnectivity-ca\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-log-socket\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-config\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-script-lib\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-os-release\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8530569c-6697-47e7-b09a-7423346a9a16-ovn-node-metrics-cert\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-bin\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fh5t\" (UniqueName: \"kubernetes.io/projected/8530569c-6697-47e7-b09a-7423346a9a16-kube-api-access-7fh5t\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-iptables-alerter-script\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.681528 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-binary-copy\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-cni-binary-copy\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c22h2\" (UniqueName: \"kubernetes.io/projected/3cd95ab3-6a59-416f-8237-2554fc18b54f-kube-api-access-c22h2\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-kubelet\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-multus-certs\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mp4n\" (UniqueName: \"kubernetes.io/projected/8b42bf05-9792-4dd3-9486-e262d6b7afc8-kube-api-access-4mp4n\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681771 2574
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681802 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-system-cni-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-etc-kubernetes\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.681951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-netns\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.681979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b42bf05-9792-4dd3-9486-e262d6b7afc8-serviceca\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-kubelet\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b42bf05-9792-4dd3-9486-e262d6b7afc8-host\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-cnibin\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682156 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-os-release\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-systemd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682249 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4gxq\" (UniqueName: \"kubernetes.io/projected/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-kube-api-access-t4gxq\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.682326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-bin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-netns\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682395 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvn6\" (UniqueName: \"kubernetes.io/projected/26ae5960-a77b-482e-891f-f7d7f829e0d2-kube-api-access-vlvn6\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-var-lib-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-ovn\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-daemon-config\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-etc-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-env-overrides\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-socket-dir-parent\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-node-log\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcbn\" (UniqueName: \"kubernetes.io/projected/8ed80245-164d-4d1c-8ed3-05523db4cd57-kube-api-access-tpcbn\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682644 2574
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-system-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.682718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.682721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-cnibin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.712805 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.712776 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:28 +0000 UTC" deadline="2027-10-31 07:05:22.06441002 +0000 UTC" Apr 24 21:26:29.712805 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.712805 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13305h38m52.351608798s" Apr 24 21:26:29.772660 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.772629 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:26:29.783665 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-host-slash\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-k8s-cni-cncf-io\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-multus\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-host-slash\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-hostroot\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/bae04246-77d0-46d3-9aa4-2a74e4817f4f-agent-certs\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-multus\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.783785 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bae04246-77d0-46d3-9aa4-2a74e4817f4f-konnectivity-ca\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-k8s-cni-cncf-io\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-hostroot\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-log-socket\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-config\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-script-lib\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-log-socket\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.783946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-modprobe-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-kubernetes\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccplh\" (UniqueName: \"kubernetes.io/projected/c1aab734-6300-41c5-9e50-7c87b69a3861-kube-api-access-ccplh\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.784146 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-socket-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-os-release\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8530569c-6697-47e7-b09a-7423346a9a16-ovn-node-metrics-cert\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784604 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:26:29.784217 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-device-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-bin\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fh5t\" (UniqueName: \"kubernetes.io/projected/8530569c-6697-47e7-b09a-7423346a9a16-kube-api-access-7fh5t\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-systemd\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784386 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-registration-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bae04246-77d0-46d3-9aa4-2a74e4817f4f-konnectivity-ca\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-iptables-alerter-script\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-binary-copy\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.784604 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-config\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-cni-binary-copy\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjr9s\" (UniqueName: \"kubernetes.io/projected/38df65e2-fa0a-4a67-be3e-faf56659341f-kube-api-access-mjr9s\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-tuned\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c22h2\" (UniqueName: \"kubernetes.io/projected/3cd95ab3-6a59-416f-8237-2554fc18b54f-kube-api-access-c22h2\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-kubelet\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-multus-certs\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.784976 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-iptables-alerter-script\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-multus-certs\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mp4n\" (UniqueName: \"kubernetes.io/projected/8b42bf05-9792-4dd3-9486-e262d6b7afc8-kube-api-access-4mp4n\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-ovnkube-script-lib\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysconfig\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785136 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-system-cni-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785163 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-etc-kubernetes\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-netns\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-lib-modules\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.785949 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:26:29.785249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-binary-copy\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b42bf05-9792-4dd3-9486-e262d6b7afc8-serviceca\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-os-release\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-bin\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-etc-kubernetes\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-kubelet\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-host\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: 
\"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-etc-selinux\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b42bf05-9792-4dd3-9486-e262d6b7afc8-host\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-cnibin\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:29.785949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785663 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-kubelet\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw" Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-os-release\") pod 
\"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-netns\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-system-cni-dir\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cd95ab3-6a59-416f-8237-2554fc18b54f-cnibin\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b42bf05-9792-4dd3-9486-e262d6b7afc8-host\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-os-release\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-systemd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.785994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-kubelet\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-var-lib-kubelet\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-systemd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.786071 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4gxq\" (UniqueName: \"kubernetes.io/projected/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-kube-api-access-t4gxq\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b42bf05-9792-4dd3-9486-e262d6b7afc8-serviceca\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.786199 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.286139556 +0000 UTC m=+3.093496166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:29.786698 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-bin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-var-lib-cni-bin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-sys-fs\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-netns\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvn6\" (UniqueName: \"kubernetes.io/projected/26ae5960-a77b-482e-891f-f7d7f829e0d2-kube-api-access-vlvn6\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-host-run-netns\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-var-lib-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-var-lib-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-ovn\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-ovn\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-run-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3cd95ab3-6a59-416f-8237-2554fc18b54f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-daemon-config\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-etc-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.787482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.786979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-cni-binary-copy\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-daemon-config\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-env-overrides\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-tmp\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-etc-openvswitch\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-socket-dir-parent\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-node-log\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpcbn\" (UniqueName: \"kubernetes.io/projected/8ed80245-164d-4d1c-8ed3-05523db4cd57-kube-api-access-tpcbn\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-socket-dir-parent\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-conf\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-node-log\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-run\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-system-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-system-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8530569c-6697-47e7-b09a-7423346a9a16-env-overrides\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.788272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-cni-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-cnibin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-conf-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-systemd-units\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-slash\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-netd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-sys\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.787660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-multus-conf-dir\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.788328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-slash\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.788369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-systemd-units\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.788387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26ae5960-a77b-482e-891f-f7d7f829e0d2-cnibin\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.789054 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.788442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8530569c-6697-47e7-b09a-7423346a9a16-host-cni-netd\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.789502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.789394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bae04246-77d0-46d3-9aa4-2a74e4817f4f-agent-certs\") pod \"konnectivity-agent-4lns9\" (UID: \"bae04246-77d0-46d3-9aa4-2a74e4817f4f\") " pod="kube-system/konnectivity-agent-4lns9"
Apr 24 21:26:29.791582 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.791090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8530569c-6697-47e7-b09a-7423346a9a16-ovn-node-metrics-cert\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.792141 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.792097 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" event={"ID":"562a113f040bbc989a373b07efb12bcb","Type":"ContainerStarted","Data":"9d09546d37fd81fb45a8c2ecb73911386117ebaee080f4e26828002d5ab2f9c2"}
Apr 24 21:26:29.792974 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.792957 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:29.793070 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.792978 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:29.793070 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.792991 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:29.793402 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:29.793243 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.293182265 +0000 UTC m=+3.100538869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:29.794159 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.794095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" event={"ID":"15783212e1fd5fb858f33c4536fa6518","Type":"ContainerStarted","Data":"b03f053cdf5c5c214088ed810edada1d6255dd3a3e29a75a92a7754181feb1f4"}
Apr 24 21:26:29.795915 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.795892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvn6\" (UniqueName: \"kubernetes.io/projected/26ae5960-a77b-482e-891f-f7d7f829e0d2-kube-api-access-vlvn6\") pod \"multus-msllw\" (UID: \"26ae5960-a77b-482e-891f-f7d7f829e0d2\") " pod="openshift-multus/multus-msllw"
Apr 24 21:26:29.796213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.796191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4gxq\" (UniqueName: \"kubernetes.io/projected/9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de-kube-api-access-t4gxq\") pod \"iptables-alerter-6h52t\" (UID: \"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de\") " pod="openshift-network-operator/iptables-alerter-6h52t"
Apr 24 21:26:29.796311 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.796220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c22h2\" (UniqueName: \"kubernetes.io/projected/3cd95ab3-6a59-416f-8237-2554fc18b54f-kube-api-access-c22h2\") pod \"multus-additional-cni-plugins-glnmw\" (UID: \"3cd95ab3-6a59-416f-8237-2554fc18b54f\") " pod="openshift-multus/multus-additional-cni-plugins-glnmw"
Apr 24 21:26:29.796405 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.796388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fh5t\" (UniqueName: \"kubernetes.io/projected/8530569c-6697-47e7-b09a-7423346a9a16-kube-api-access-7fh5t\") pod \"ovnkube-node-g9dbf\" (UID: \"8530569c-6697-47e7-b09a-7423346a9a16\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf"
Apr 24 21:26:29.796489 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.796469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mp4n\" (UniqueName: \"kubernetes.io/projected/8b42bf05-9792-4dd3-9486-e262d6b7afc8-kube-api-access-4mp4n\") pod \"node-ca-l5t5z\" (UID: \"8b42bf05-9792-4dd3-9486-e262d6b7afc8\") " pod="openshift-image-registry/node-ca-l5t5z"
Apr 24 21:26:29.796673 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.796653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcbn\" (UniqueName: \"kubernetes.io/projected/8ed80245-164d-4d1c-8ed3-05523db4cd57-kube-api-access-tpcbn\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:29.889235 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjr9s\" (UniqueName: \"kubernetes.io/projected/38df65e2-fa0a-4a67-be3e-faf56659341f-kube-api-access-mjr9s\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.889235 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-tuned\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysconfig\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-lib-modules\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-host\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-etc-selinux\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-var-lib-kubelet\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889455 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-sys-fs\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-tmp\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-conf\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-run\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-host\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-sys\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-modprobe-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-kubernetes\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-lib-modules\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccplh\" (UniqueName: \"kubernetes.io/projected/c1aab734-6300-41c5-9e50-7c87b69a3861-kube-api-access-ccplh\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-kubernetes\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2"
Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\"
(UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-socket-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysconfig\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-device-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-systemd\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-registration-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889796 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.889808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-sys\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-modprobe-d\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-var-lib-kubelet\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-etc-selinux\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-sys-fs\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-sysctl-conf\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.889981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-run\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.890002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-registration-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.890022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-socket-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.890045 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/38df65e2-fa0a-4a67-be3e-faf56659341f-device-dir\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.890556 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.890102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-systemd\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.891616 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.891579 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-etc-tuned\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.891865 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.891845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1aab734-6300-41c5-9e50-7c87b69a3861-tmp\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.897949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.897922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccplh\" (UniqueName: \"kubernetes.io/projected/c1aab734-6300-41c5-9e50-7c87b69a3861-kube-api-access-ccplh\") pod \"tuned-hwpr2\" (UID: \"c1aab734-6300-41c5-9e50-7c87b69a3861\") " pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:29.898169 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.898147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mjr9s\" (UniqueName: \"kubernetes.io/projected/38df65e2-fa0a-4a67-be3e-faf56659341f-kube-api-access-mjr9s\") pod \"aws-ebs-csi-driver-node-hhvfz\" (UID: \"38df65e2-fa0a-4a67-be3e-faf56659341f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:29.969186 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.969068 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-msllw" Apr 24 21:26:29.977081 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.977049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6h52t" Apr 24 21:26:29.987808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.987785 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:29.992380 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.992354 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5t5z" Apr 24 21:26:29.998918 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:29.998899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-glnmw" Apr 24 21:26:30.006479 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.006456 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:30.014159 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.014136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" Apr 24 21:26:30.019693 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.019675 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" Apr 24 21:26:30.292763 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.292673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:30.292948 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.292823 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:30.292948 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.292904 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.292886601 +0000 UTC m=+4.100243189 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:30.393592 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.393558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:30.393740 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.393679 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:30.393740 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.393692 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:30.393740 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.393702 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:30.393873 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:30.393751 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:31.39373848 +0000 UTC m=+4.201095063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:30.429486 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.429447 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b42bf05_9792_4dd3_9486_e262d6b7afc8.slice/crio-547d962cf2cc84743e20d55bfec5e3193c2e82a4ac43ad03838987fbe8fb7b50 WatchSource:0}: Error finding container 547d962cf2cc84743e20d55bfec5e3193c2e82a4ac43ad03838987fbe8fb7b50: Status 404 returned error can't find the container with id 547d962cf2cc84743e20d55bfec5e3193c2e82a4ac43ad03838987fbe8fb7b50 Apr 24 21:26:30.430781 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.430754 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8530569c_6697_47e7_b09a_7423346a9a16.slice/crio-e668028a4b3b2343c29e7b013c0fc557a0822a04526ec35a903541e6aa0f3d48 WatchSource:0}: Error finding container e668028a4b3b2343c29e7b013c0fc557a0822a04526ec35a903541e6aa0f3d48: Status 404 returned error can't find the container with id e668028a4b3b2343c29e7b013c0fc557a0822a04526ec35a903541e6aa0f3d48 Apr 24 21:26:30.432280 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.432191 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae04246_77d0_46d3_9aa4_2a74e4817f4f.slice/crio-94222b6554cef853d24d15b0ab0dcf8161acd86227b16291a5ac3f2a143f21a3 WatchSource:0}: Error finding container 
94222b6554cef853d24d15b0ab0dcf8161acd86227b16291a5ac3f2a143f21a3: Status 404 returned error can't find the container with id 94222b6554cef853d24d15b0ab0dcf8161acd86227b16291a5ac3f2a143f21a3 Apr 24 21:26:30.435522 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.435500 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1aab734_6300_41c5_9e50_7c87b69a3861.slice/crio-66c39b5ce04dfa97b9a7b9eab5598beee2707ff28783fe9a466f17715d29b40b WatchSource:0}: Error finding container 66c39b5ce04dfa97b9a7b9eab5598beee2707ff28783fe9a466f17715d29b40b: Status 404 returned error can't find the container with id 66c39b5ce04dfa97b9a7b9eab5598beee2707ff28783fe9a466f17715d29b40b Apr 24 21:26:30.436896 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.436866 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5b0fff_78bc_4c1d_8ac3_bc6cf30d06de.slice/crio-df594674775511b076b9242436901652ab2bdf1cf0c94f58a89d01453a385600 WatchSource:0}: Error finding container df594674775511b076b9242436901652ab2bdf1cf0c94f58a89d01453a385600: Status 404 returned error can't find the container with id df594674775511b076b9242436901652ab2bdf1cf0c94f58a89d01453a385600 Apr 24 21:26:30.437669 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.437590 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38df65e2_fa0a_4a67_be3e_faf56659341f.slice/crio-0ae836555ca61b8ee4d24cd77b18af79d816e68d6b94e9b7681f96b44389aebf WatchSource:0}: Error finding container 0ae836555ca61b8ee4d24cd77b18af79d816e68d6b94e9b7681f96b44389aebf: Status 404 returned error can't find the container with id 0ae836555ca61b8ee4d24cd77b18af79d816e68d6b94e9b7681f96b44389aebf Apr 24 21:26:30.439197 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.439176 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd95ab3_6a59_416f_8237_2554fc18b54f.slice/crio-4e0c72b7967cd2c12add63958ab06ba32e8471fcec1c007e11c0a4c133b8fc75 WatchSource:0}: Error finding container 4e0c72b7967cd2c12add63958ab06ba32e8471fcec1c007e11c0a4c133b8fc75: Status 404 returned error can't find the container with id 4e0c72b7967cd2c12add63958ab06ba32e8471fcec1c007e11c0a4c133b8fc75 Apr 24 21:26:30.441174 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:30.440948 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ae5960_a77b_482e_891f_f7d7f829e0d2.slice/crio-7b8c45f78376403a43695b82be76181745d445f58d4d6b05533326acc3a3b044 WatchSource:0}: Error finding container 7b8c45f78376403a43695b82be76181745d445f58d4d6b05533326acc3a3b044: Status 404 returned error can't find the container with id 7b8c45f78376403a43695b82be76181745d445f58d4d6b05533326acc3a3b044 Apr 24 21:26:30.713945 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.713868 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:28 +0000 UTC" deadline="2027-12-31 14:18:40.738488958 +0000 UTC" Apr 24 21:26:30.713945 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.713898 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14776h52m10.02459308s" Apr 24 21:26:30.808707 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.808619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6h52t" event={"ID":"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de","Type":"ContainerStarted","Data":"df594674775511b076b9242436901652ab2bdf1cf0c94f58a89d01453a385600"} Apr 24 21:26:30.813052 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.812984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" event={"ID":"c1aab734-6300-41c5-9e50-7c87b69a3861","Type":"ContainerStarted","Data":"66c39b5ce04dfa97b9a7b9eab5598beee2707ff28783fe9a466f17715d29b40b"} Apr 24 21:26:30.826927 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.826889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4lns9" event={"ID":"bae04246-77d0-46d3-9aa4-2a74e4817f4f","Type":"ContainerStarted","Data":"94222b6554cef853d24d15b0ab0dcf8161acd86227b16291a5ac3f2a143f21a3"} Apr 24 21:26:30.835482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.835434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5t5z" event={"ID":"8b42bf05-9792-4dd3-9486-e262d6b7afc8","Type":"ContainerStarted","Data":"547d962cf2cc84743e20d55bfec5e3193c2e82a4ac43ad03838987fbe8fb7b50"} Apr 24 21:26:30.841219 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.840571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" event={"ID":"562a113f040bbc989a373b07efb12bcb","Type":"ContainerStarted","Data":"0be25ef18b0b0788645f7143ae51783e5c45f1a5bad88fc636124427100ce6c2"} Apr 24 21:26:30.844994 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.844971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-msllw" event={"ID":"26ae5960-a77b-482e-891f-f7d7f829e0d2","Type":"ContainerStarted","Data":"7b8c45f78376403a43695b82be76181745d445f58d4d6b05533326acc3a3b044"} Apr 24 21:26:30.849588 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.849561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerStarted","Data":"4e0c72b7967cd2c12add63958ab06ba32e8471fcec1c007e11c0a4c133b8fc75"} Apr 24 21:26:30.859505 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.859453 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" event={"ID":"38df65e2-fa0a-4a67-be3e-faf56659341f","Type":"ContainerStarted","Data":"0ae836555ca61b8ee4d24cd77b18af79d816e68d6b94e9b7681f96b44389aebf"} Apr 24 21:26:30.878056 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:30.878026 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"e668028a4b3b2343c29e7b013c0fc557a0822a04526ec35a903541e6aa0f3d48"} Apr 24 21:26:31.300416 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.300373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:31.300597 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.300550 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:31.300660 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.300616 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:33.300595735 +0000 UTC m=+6.107952323 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:31.401608 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.400980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:31.401608 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.401172 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:31.401608 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.401194 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:31.401608 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.401207 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:31.401608 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.401269 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:33.401248442 +0000 UTC m=+6.208605028 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:31.786284 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.786252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:31.786598 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.786384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:31.786786 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.786770 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:31.786920 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:31.786875 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:31.898657 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.898578 2574 generic.go:358] "Generic (PLEG): container finished" podID="15783212e1fd5fb858f33c4536fa6518" containerID="f815d47a76f6409dfe6ecab2cde1febfe19a4bc949431cf467aba8bf75126f6c" exitCode=0
Apr 24 21:26:31.898802 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.898685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" event={"ID":"15783212e1fd5fb858f33c4536fa6518","Type":"ContainerDied","Data":"f815d47a76f6409dfe6ecab2cde1febfe19a4bc949431cf467aba8bf75126f6c"}
Apr 24 21:26:31.914574 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:31.913811 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-28.ec2.internal" podStartSLOduration=2.9137944129999998 podStartE2EDuration="2.913794413s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:30.857328574 +0000 UTC m=+3.664685180" watchObservedRunningTime="2026-04-24 21:26:31.913794413 +0000 UTC m=+4.721151022"
Apr 24 21:26:32.903961 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:32.903925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" event={"ID":"15783212e1fd5fb858f33c4536fa6518","Type":"ContainerStarted","Data":"a16deb3240e0db1b2560d675e6d80d8111a8be0833740ae786018d79eb462760"}
Apr 24 21:26:32.921506 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:32.921456 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-28.ec2.internal" podStartSLOduration=3.921437707 podStartE2EDuration="3.921437707s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:32.921325447 +0000 UTC m=+5.728682053" watchObservedRunningTime="2026-04-24 21:26:32.921437707 +0000 UTC m=+5.728794313"
Apr 24 21:26:33.317492 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.316721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:33.317492 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.316925 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:33.317492 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.316989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:37.316970301 +0000 UTC m=+10.124326887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:33.418193 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.417568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:33.418193 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.417746 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:33.418193 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.417763 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:33.418193 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.417776 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:33.418193 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.417861 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:37.417839215 +0000 UTC m=+10.225195818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:33.651685 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.650909 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5k6kg"]
Apr 24 21:26:33.654288 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.654234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.662712 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.662495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84r2g\""
Apr 24 21:26:33.662798 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.662772 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:33.663013 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.662994 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:33.720634 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.720597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00622ba2-e987-487b-870b-1558450fa114-tmp-dir\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.720809 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.720685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00622ba2-e987-487b-870b-1558450fa114-hosts-file\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.720809 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.720717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx69z\" (UniqueName: \"kubernetes.io/projected/00622ba2-e987-487b-870b-1558450fa114-kube-api-access-rx69z\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.785649 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.784973 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:33.785649 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.785106 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:33.785649 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.785521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:33.785649 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:33.785606 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:33.822400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.821809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00622ba2-e987-487b-870b-1558450fa114-hosts-file\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.822400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.821892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx69z\" (UniqueName: \"kubernetes.io/projected/00622ba2-e987-487b-870b-1558450fa114-kube-api-access-rx69z\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.822400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.821942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00622ba2-e987-487b-870b-1558450fa114-tmp-dir\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.822400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.822182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00622ba2-e987-487b-870b-1558450fa114-hosts-file\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.822400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.822353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00622ba2-e987-487b-870b-1558450fa114-tmp-dir\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.835762 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.835709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx69z\" (UniqueName: \"kubernetes.io/projected/00622ba2-e987-487b-870b-1558450fa114-kube-api-access-rx69z\") pod \"node-resolver-5k6kg\" (UID: \"00622ba2-e987-487b-870b-1558450fa114\") " pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:33.967738 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:33.967605 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5k6kg"
Apr 24 21:26:35.783648 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:35.783617 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:35.784131 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:35.783743 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:35.784131 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:35.783843 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:35.784131 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:35.783949 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:37.350930 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:37.350891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:37.351488 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.351052 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:37.351488 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.351131 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.351108447 +0000 UTC m=+18.158465033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:37.451765 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:37.451715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:37.451966 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.451902 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:37.451966 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.451931 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:37.451966 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.451945 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:37.452118 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.452010 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.451990782 +0000 UTC m=+18.259347365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:37.788728 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:37.786356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:37.788728 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.786756 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:37.788728 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:37.786844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:37.788728 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:37.786943 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:39.783672 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.783628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:39.784145 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:39.783785 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:39.784145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.783628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:39.784283 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:39.784260 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:39.921965 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.921876 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9fjb6"]
Apr 24 21:26:39.924721 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.924699 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:39.924860 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:39.924780 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e"
Apr 24 21:26:39.969034 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.968997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-kubelet-config\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:39.969202 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.969039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-dbus\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:39.969202 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:39.969162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070413 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.070325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-kubelet-config\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070413 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.070365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-dbus\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070625 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.070430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070625 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.070497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-kubelet-config\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070625 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:40.070552 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:40.070625 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.070557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2b28d774-e7a8-450d-9ac2-f68dc752098e-dbus\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.070625 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:40.070618 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:40.570597687 +0000 UTC m=+13.377954290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:40.576279 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:40.576247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:40.576450 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:40.576387 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:40.576504 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:40.576461 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:41.576440201 +0000 UTC m=+14.383796785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:41.584989 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:41.584949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:41.585491 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:41.585115 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:41.585491 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:41.585186 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.585168252 +0000 UTC m=+16.392524834 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:41.783664 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:41.783629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:41.783847 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:41.783630 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:41.783847 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:41.783750 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e"
Apr 24 21:26:41.783948 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:41.783885 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:41.783948 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:41.783628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:41.784064 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:41.784041 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:43.599310 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:43.599280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:43.599710 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:43.599395 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:43.599710 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:43.599454 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:47.599438337 +0000 UTC m=+20.406794924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:43.783229 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:43.783187 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:43.783404 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:43.783187 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:43.783404 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:43.783314 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:43.783404 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:43.783369 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e"
Apr 24 21:26:43.783404 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:43.783187 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:43.783579 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:43.783446 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:45.409960 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:45.409919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:45.410464 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.410084 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:45.410464 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.410147 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.410132174 +0000 UTC m=+34.217488761 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:45.510540 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:45.510494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:45.510766 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.510676 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:45.510766 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.510698 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:45.510766 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.510708 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:45.510766 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.510765 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.510746675 +0000 UTC m=+34.318103282 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:45.783360 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:45.783263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:26:45.783518 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:45.783263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24"
Apr 24 21:26:45.783518 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.783405 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:26:45.783518 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:45.783278 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:45.783518 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.783489 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41"
Apr 24 21:26:45.783699 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:45.783614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e"
Apr 24 21:26:47.013966 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:26:47.013931 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00622ba2_e987_487b_870b_1558450fa114.slice/crio-bebf29282f5857bae31886e9138da697c313e95308485f7ba22dc5b402df5571 WatchSource:0}: Error finding container bebf29282f5857bae31886e9138da697c313e95308485f7ba22dc5b402df5571: Status 404 returned error can't find the container with id bebf29282f5857bae31886e9138da697c313e95308485f7ba22dc5b402df5571
Apr 24 21:26:47.625525 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.625364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6"
Apr 24 21:26:47.625646 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:47.625527 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:47.625646 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:47.625633 2574 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.625611745 +0000 UTC m=+28.432968343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:47.784359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.784335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:47.784480 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:47.784437 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:47.784550 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.784525 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:47.784716 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:47.784636 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:47.784716 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.784696 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:47.784875 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:47.784818 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:47.931287 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.931256 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" event={"ID":"c1aab734-6300-41c5-9e50-7c87b69a3861","Type":"ContainerStarted","Data":"b7370e984dbc0cf4b2af882959d599b2fe8b4ee71d1730f125896f1535095666"} Apr 24 21:26:47.932491 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.932461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4lns9" event={"ID":"bae04246-77d0-46d3-9aa4-2a74e4817f4f","Type":"ContainerStarted","Data":"61c18d89b9c26f6d1b8fd22b45ec301c01ed3a14b6ac546d9fa5ceb3882e50e2"} Apr 24 21:26:47.933617 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.933595 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5t5z" event={"ID":"8b42bf05-9792-4dd3-9486-e262d6b7afc8","Type":"ContainerStarted","Data":"df927e558c786197388e9dee73265145b535998b1e527af7b02e98c472547581"} Apr 24 21:26:47.934767 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.934748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5k6kg" 
event={"ID":"00622ba2-e987-487b-870b-1558450fa114","Type":"ContainerStarted","Data":"80d99124c127edb7fc4f3ad3f6affa8ea05ed6d52f4df6a1ba44c8fdbbfae423"} Apr 24 21:26:47.934891 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.934779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5k6kg" event={"ID":"00622ba2-e987-487b-870b-1558450fa114","Type":"ContainerStarted","Data":"bebf29282f5857bae31886e9138da697c313e95308485f7ba22dc5b402df5571"} Apr 24 21:26:47.935952 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.935930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-msllw" event={"ID":"26ae5960-a77b-482e-891f-f7d7f829e0d2","Type":"ContainerStarted","Data":"048fd77d539953898ef913448909aa94426ef14434ed844bcf962311b7a3fa45"} Apr 24 21:26:47.937144 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.937125 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="305b974a5e3d1334aea9a346bf93bd552715585f31c9e538e8640e0ade70009b" exitCode=0 Apr 24 21:26:47.937220 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.937178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"305b974a5e3d1334aea9a346bf93bd552715585f31c9e538e8640e0ade70009b"} Apr 24 21:26:47.938568 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.938502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" event={"ID":"38df65e2-fa0a-4a67-be3e-faf56659341f","Type":"ContainerStarted","Data":"b7a91e92a0cfd1c3552ee958f3b8b1d8aa76ef8583d6fc28bd49f404c479f511"} Apr 24 21:26:47.940962 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.940945 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:26:47.941295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941276 2574 generic.go:358] "Generic (PLEG): container finished" podID="8530569c-6697-47e7-b09a-7423346a9a16" containerID="38736d651e076e548e1742eb968a07ff2b26ebd9652f89d22fb19ea85974dde1" exitCode=1 Apr 24 21:26:47.941363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941303 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"08d1cb70ed4f4d2bdc017ce5fb3d3e03fde657f00e3f006aec020c8194375e96"} Apr 24 21:26:47.941363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"a7f2a7b33ebdd58630ea38352e211339e328a39def21d4f29c1eed8164eda531"} Apr 24 21:26:47.941363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"1c15bb26eb2e54a8340597b16a3fc10b06b9dcb1634e0e27cb5e73cb4608b89c"} Apr 24 21:26:47.941363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerDied","Data":"38736d651e076e548e1742eb968a07ff2b26ebd9652f89d22fb19ea85974dde1"} Apr 24 21:26:47.941363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.941354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" 
event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"d71a4c859c90840ba7f0385a4f45a00ee830832f54f9484d94053593b6de6e84"} Apr 24 21:26:47.953618 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.953572 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hwpr2" podStartSLOduration=4.379357906 podStartE2EDuration="20.953557863s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.437955441 +0000 UTC m=+3.245312023" lastFinishedPulling="2026-04-24 21:26:47.012155381 +0000 UTC m=+19.819511980" observedRunningTime="2026-04-24 21:26:47.953130944 +0000 UTC m=+20.760487548" watchObservedRunningTime="2026-04-24 21:26:47.953557863 +0000 UTC m=+20.760914473" Apr 24 21:26:47.968793 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:47.968741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l5t5z" podStartSLOduration=4.389242602 podStartE2EDuration="20.96872701s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.431370904 +0000 UTC m=+3.238727488" lastFinishedPulling="2026-04-24 21:26:47.01085531 +0000 UTC m=+19.818211896" observedRunningTime="2026-04-24 21:26:47.967841518 +0000 UTC m=+20.775198114" watchObservedRunningTime="2026-04-24 21:26:47.96872701 +0000 UTC m=+20.776083617" Apr 24 21:26:48.007932 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.007885 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-msllw" podStartSLOduration=4.40259287 podStartE2EDuration="21.007871295s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.442526498 +0000 UTC m=+3.249883084" lastFinishedPulling="2026-04-24 21:26:47.047804911 +0000 UTC m=+19.855161509" observedRunningTime="2026-04-24 21:26:48.007486155 +0000 UTC m=+20.814842757" 
watchObservedRunningTime="2026-04-24 21:26:48.007871295 +0000 UTC m=+20.815227901" Apr 24 21:26:48.024273 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.024178 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4lns9" podStartSLOduration=8.952196644 podStartE2EDuration="21.024163044s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.435662385 +0000 UTC m=+3.243018968" lastFinishedPulling="2026-04-24 21:26:42.50762877 +0000 UTC m=+15.314985368" observedRunningTime="2026-04-24 21:26:48.023710992 +0000 UTC m=+20.831067596" watchObservedRunningTime="2026-04-24 21:26:48.024163044 +0000 UTC m=+20.831519650" Apr 24 21:26:48.038392 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.038353 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5k6kg" podStartSLOduration=15.038338036 podStartE2EDuration="15.038338036s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:48.037845375 +0000 UTC m=+20.845201975" watchObservedRunningTime="2026-04-24 21:26:48.038338036 +0000 UTC m=+20.845694640" Apr 24 21:26:48.438745 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.438691 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:48.439396 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.439372 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:48.481893 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.481865 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:26:48.755754 ip-10-0-137-28 kubenswrapper[2574]: 
I0424 21:26:48.755651 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:48.481888547Z","UUID":"c0eabde9-9f0e-44ad-9365-b2507311d8b9","Handler":null,"Name":"","Endpoint":""} Apr 24 21:26:48.758787 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.758753 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:26:48.758787 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.758782 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:26:48.945575 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.945483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" event={"ID":"38df65e2-fa0a-4a67-be3e-faf56659341f","Type":"ContainerStarted","Data":"d9ed8554e6ed89b8492439a6db696ad8a71e8652a2b4dbb1f207291daba42529"} Apr 24 21:26:48.949030 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.949004 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:26:48.949469 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.949439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"a49e24eede371cfd563f9cfd4548c5edf3945132352ba4aeb75e8f30c4980577"} Apr 24 21:26:48.951059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.950995 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6h52t" 
event={"ID":"9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de","Type":"ContainerStarted","Data":"4286d85c80aff9de5c68f2373a66a263014ebe4b107bbb9e6fea7cce6e73359d"} Apr 24 21:26:48.951251 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.951202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:48.951960 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.951940 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4lns9" Apr 24 21:26:48.968308 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:48.968254 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6h52t" podStartSLOduration=5.396133191 podStartE2EDuration="21.968241748s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.438737679 +0000 UTC m=+3.246094268" lastFinishedPulling="2026-04-24 21:26:47.010846242 +0000 UTC m=+19.818202825" observedRunningTime="2026-04-24 21:26:48.967738851 +0000 UTC m=+21.775095457" watchObservedRunningTime="2026-04-24 21:26:48.968241748 +0000 UTC m=+21.775598352" Apr 24 21:26:49.783975 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:49.783881 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:49.783975 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:49.783905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:49.783975 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:49.783881 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:49.784968 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:49.784017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:49.784968 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:49.784118 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:49.784968 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:49.784281 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:49.955321 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:49.955273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" event={"ID":"38df65e2-fa0a-4a67-be3e-faf56659341f","Type":"ContainerStarted","Data":"0983540d9abab16b21dc0a96d14199d3f4e4bf0e1d2368b66bc78d3c29bde4ec"} Apr 24 21:26:49.974594 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:49.974547 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hhvfz" podStartSLOduration=2.905780435 podStartE2EDuration="21.974532339s" podCreationTimestamp="2026-04-24 21:26:28 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.43988445 +0000 UTC m=+3.247241036" lastFinishedPulling="2026-04-24 21:26:49.508636344 +0000 UTC m=+22.315992940" observedRunningTime="2026-04-24 21:26:49.974066247 +0000 UTC m=+22.781422852" watchObservedRunningTime="2026-04-24 21:26:49.974532339 +0000 UTC m=+22.781888944" Apr 24 21:26:50.960864 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:50.960812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:26:50.961356 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:50.961231 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"ec552650f30816eeae1061376d5826b29869579b9808ee91402e14b06bce3d2f"} Apr 24 21:26:51.783324 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:51.783291 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:51.783516 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:51.783291 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:51.783516 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:51.783429 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:51.783617 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:51.783519 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:51.783617 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:51.783291 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:51.783617 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:51.783603 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:52.967817 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.967653 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:26:52.968247 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.968167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"eb3bcf198f4d4726356c261a3e9edf0e028914ef125bf269cb49fce590aeebd0"} Apr 24 21:26:52.968479 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.968453 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:52.968479 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.968483 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:52.968678 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.968639 2574 scope.go:117] "RemoveContainer" containerID="38736d651e076e548e1742eb968a07ff2b26ebd9652f89d22fb19ea85974dde1" Apr 24 21:26:52.970036 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.970011 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="b34c7075bd6ce9764ddcccf4dba14578dfaa4bf743be72e2319e8a4111e128e3" exitCode=0 Apr 24 21:26:52.970135 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:52.970047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"b34c7075bd6ce9764ddcccf4dba14578dfaa4bf743be72e2319e8a4111e128e3"} Apr 24 21:26:52.983433 ip-10-0-137-28 kubenswrapper[2574]: I0424 
21:26:52.983416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:53.783867 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.783679 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:53.783984 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.783686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:53.783984 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:53.783955 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:53.784067 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.783721 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:53.784067 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:53.784021 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:53.784136 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:53.784068 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:53.974240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.974207 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="9c4343370ac182d15fa09d63228f4d744490cf043af86b0d417788e377e81e74" exitCode=0 Apr 24 21:26:53.974645 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.974286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"9c4343370ac182d15fa09d63228f4d744490cf043af86b0d417788e377e81e74"} Apr 24 21:26:53.977786 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.977768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:26:53.978134 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.978113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" event={"ID":"8530569c-6697-47e7-b09a-7423346a9a16","Type":"ContainerStarted","Data":"779775a14aacfcdfb5de09e0e29dd1992a6eabbc77219b53be9c8e6fe39d6378"} Apr 24 21:26:53.978420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.978398 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:53.991865 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:53.991841 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:26:54.021985 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.021935 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" podStartSLOduration=10.117723253 podStartE2EDuration="27.02192151s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.434586276 +0000 UTC m=+3.241942860" lastFinishedPulling="2026-04-24 21:26:47.338784529 +0000 UTC m=+20.146141117" observedRunningTime="2026-04-24 21:26:54.02180062 +0000 UTC m=+26.829157225" watchObservedRunningTime="2026-04-24 21:26:54.02192151 +0000 UTC m=+26.829278148" Apr 24 21:26:54.348766 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.348692 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9csmp"] Apr 24 21:26:54.348905 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.348815 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:54.348978 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:54.348955 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:54.349432 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.349414 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9fjb6"] Apr 24 21:26:54.349498 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.349488 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:54.349606 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:54.349589 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:54.350405 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.350378 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lqj24"] Apr 24 21:26:54.350518 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.350451 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:54.350568 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:54.350522 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:54.982240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.982161 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="52db1c547cf444aa6781482ec475f73279dcaec773eb3105fa1b6cf2662b44ef" exitCode=0 Apr 24 21:26:54.982805 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:54.982238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"52db1c547cf444aa6781482ec475f73279dcaec773eb3105fa1b6cf2662b44ef"} Apr 24 21:26:55.684727 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:55.684688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:55.684910 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:55.684841 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:55.684910 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:55.684905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret podName:2b28d774-e7a8-450d-9ac2-f68dc752098e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.684884488 +0000 UTC m=+44.492241076 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret") pod "global-pull-secret-syncer-9fjb6" (UID: "2b28d774-e7a8-450d-9ac2-f68dc752098e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:55.783727 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:55.783677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:55.783940 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:55.783677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:55.783940 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:55.783810 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:55.783940 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:55.783856 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:55.784190 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:55.783990 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:55.784190 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:55.784061 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:57.786764 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:57.786598 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:57.787255 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:57.786600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:57.787255 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:57.786870 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:26:57.787255 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:57.786637 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:57.787255 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:57.786961 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:57.787434 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:57.787403 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:59.786780 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:59.786750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:26:59.787183 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:59.786754 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:26:59.787183 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:59.786878 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lqj24" podUID="a6bcdb22-0356-4540-8553-9a968d14ba41" Apr 24 21:26:59.787183 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:26:59.786755 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:26:59.787183 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:59.786935 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9fjb6" podUID="2b28d774-e7a8-450d-9ac2-f68dc752098e" Apr 24 21:26:59.787183 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:26:59.787004 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57" Apr 24 21:27:00.063415 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.063325 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-28.ec2.internal" event="NodeReady" Apr 24 21:27:00.063580 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.063470 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:00.107668 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.107631 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w6kmm"] Apr 24 21:27:00.140408 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.140382 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p9djn"] Apr 24 21:27:00.140619 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.140579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.143297 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.143274 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w6kmm"] Apr 24 21:27:00.143438 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.143304 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p9djn"] Apr 24 21:27:00.143438 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.143391 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.143901 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.143880 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:00.144147 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.144127 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\"" Apr 24 21:27:00.144630 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.144598 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:00.145563 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.145546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:00.145735 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.145720 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:00.145844 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.145778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:27:00.146072 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.146056 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:00.219028 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.218994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8th\" (UniqueName: \"kubernetes.io/projected/258d3fbf-bfd8-408b-9638-e130192183f7-kube-api-access-rk8th\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.219195 
ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.219071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d3fbf-bfd8-408b-9638-e130192183f7-tmp-dir\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.219195 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.219095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.219195 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.219134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.219195 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.219160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4t4z\" (UniqueName: \"kubernetes.io/projected/07d12594-0cd4-4f7e-8c3a-c529a1051347-kube-api-access-l4t4z\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.219393 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.219206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258d3fbf-bfd8-408b-9638-e130192183f7-config-volume\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " 
pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.320530 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8th\" (UniqueName: \"kubernetes.io/projected/258d3fbf-bfd8-408b-9638-e130192183f7-kube-api-access-rk8th\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.320530 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d3fbf-bfd8-408b-9638-e130192183f7-tmp-dir\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.320745 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.320745 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.320745 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4t4z\" (UniqueName: \"kubernetes.io/projected/07d12594-0cd4-4f7e-8c3a-c529a1051347-kube-api-access-l4t4z\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" 
Apr 24 21:27:00.320745 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.320650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258d3fbf-bfd8-408b-9638-e130192183f7-config-volume\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.320745 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.320685 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:00.320973 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.320752 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:00.320973 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.320754 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.820732246 +0000 UTC m=+33.628088845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:00.320973 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.320819 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.820802572 +0000 UTC m=+33.628159158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:00.321070 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.321004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d3fbf-bfd8-408b-9638-e130192183f7-tmp-dir\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.321306 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.321285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258d3fbf-bfd8-408b-9638-e130192183f7-config-volume\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.335901 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.335872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8th\" (UniqueName: \"kubernetes.io/projected/258d3fbf-bfd8-408b-9638-e130192183f7-kube-api-access-rk8th\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.336123 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.336101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4t4z\" (UniqueName: \"kubernetes.io/projected/07d12594-0cd4-4f7e-8c3a-c529a1051347-kube-api-access-l4t4z\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.823247 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.823217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:00.823540 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.823271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:00.823540 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.823376 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:00.823540 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.823392 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:00.823540 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.823454 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.8234327 +0000 UTC m=+34.630789283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:00.823540 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:00.823470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:01.82346282 +0000 UTC m=+34.630819403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:00.996542 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.996507 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="2819940dc49f10da971d9daeee2d535bfb1d98c6e0db71a3bf927fb8d1350c07" exitCode=0 Apr 24 21:27:00.996713 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:00.996564 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"2819940dc49f10da971d9daeee2d535bfb1d98c6e0db71a3bf927fb8d1350c07"} Apr 24 21:27:01.428549 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.428510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:27:01.428732 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.428677 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:01.428778 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.428748 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.42873264 +0000 UTC m=+66.236089223 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:01.529198 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.529159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:01.529349 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.529325 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:01.529403 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.529351 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:01.529403 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.529361 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nn9fp for pod openshift-network-diagnostics/network-check-target-lqj24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:01.529465 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.529414 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp podName:a6bcdb22-0356-4540-8553-9a968d14ba41 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:33.529399559 +0000 UTC m=+66.336756143 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nn9fp" (UniqueName: "kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp") pod "network-check-target-lqj24" (UID: "a6bcdb22-0356-4540-8553-9a968d14ba41") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:01.786099 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.786075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:27:01.786099 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.786084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:01.786285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.786074 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:27:01.788697 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.788681 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:01.788796 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.788773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\"" Apr 24 21:27:01.789615 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.789594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:01.789742 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.789635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:01.789742 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.789683 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9whs\"" Apr 24 21:27:01.789742 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.789729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:27:01.831362 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.831324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:01.831362 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:01.831368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") 
pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:01.831756 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.831478 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:01.831756 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.831486 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:01.831756 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.831553 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.831533175 +0000 UTC m=+36.638889759 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:01.831756 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:01.831571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.831562575 +0000 UTC m=+36.638919158 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:02.000221 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:02.000186 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cd95ab3-6a59-416f-8237-2554fc18b54f" containerID="a866683d6be9cc6a63790589d462a5a2aa3fd7c7bd18b053e73f16c8b7d9acab" exitCode=0 Apr 24 21:27:02.000374 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:02.000246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerDied","Data":"a866683d6be9cc6a63790589d462a5a2aa3fd7c7bd18b053e73f16c8b7d9acab"} Apr 24 21:27:03.005019 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:03.004988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-glnmw" event={"ID":"3cd95ab3-6a59-416f-8237-2554fc18b54f","Type":"ContainerStarted","Data":"a039a943ebaaaa140aa6b205e0f5316b375c39243d12e8687e4d5a41a663dcb9"} Apr 24 21:27:03.844331 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:03.844294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:03.844522 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:03.844340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:03.844522 
ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:03.844439 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:03.844522 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:03.844502 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.844486687 +0000 UTC m=+40.651843275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:03.844628 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:03.844445 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:03.844628 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:03.844574 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.844561665 +0000 UTC m=+40.651918247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:07.871346 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:07.871310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:07.871758 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:07.871382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:07.871758 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:07.871465 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:07.871758 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:07.871476 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:07.871758 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:07.871513 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.871499534 +0000 UTC m=+48.678856117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:07.871758 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:07.871536 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.871519322 +0000 UTC m=+48.678875909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:11.697754 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:11.697711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:27:11.700992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:11.700971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2b28d774-e7a8-450d-9ac2-f68dc752098e-original-pull-secret\") pod \"global-pull-secret-syncer-9fjb6\" (UID: \"2b28d774-e7a8-450d-9ac2-f68dc752098e\") " pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:27:12.000649 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:12.000562 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9fjb6" Apr 24 21:27:12.173538 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:12.173480 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-glnmw" podStartSLOduration=15.091439375 podStartE2EDuration="45.173464288s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.441541016 +0000 UTC m=+3.248897601" lastFinishedPulling="2026-04-24 21:27:00.523565931 +0000 UTC m=+33.330922514" observedRunningTime="2026-04-24 21:27:03.043158135 +0000 UTC m=+35.850514762" watchObservedRunningTime="2026-04-24 21:27:12.173464288 +0000 UTC m=+44.980820904" Apr 24 21:27:12.174649 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:12.174621 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9fjb6"] Apr 24 21:27:13.026636 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:13.026596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9fjb6" event={"ID":"2b28d774-e7a8-450d-9ac2-f68dc752098e","Type":"ContainerStarted","Data":"29a7f6373265444c2d23cf8c5aaba3d0a9a33d7a6df01c03841da7b403411fa1"} Apr 24 21:27:15.927582 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:15.927545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:15.928034 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:15.927591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " 
pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:15.928034 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:15.927722 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:15.928034 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:15.927787 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.927773423 +0000 UTC m=+64.735130006 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:15.928034 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:15.927722 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:15.928034 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:15.927887 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.927872917 +0000 UTC m=+64.735229505 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:17.035144 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:17.035109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9fjb6" event={"ID":"2b28d774-e7a8-450d-9ac2-f68dc752098e","Type":"ContainerStarted","Data":"8c07783ecb3448337701fb43a0b756bfb1d969d22ccfb154f89c542834264a8d"} Apr 24 21:27:17.062569 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:17.062526 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9fjb6" podStartSLOduration=34.296687298 podStartE2EDuration="38.062512407s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="2026-04-24 21:27:12.180196514 +0000 UTC m=+44.987553097" lastFinishedPulling="2026-04-24 21:27:15.946021623 +0000 UTC m=+48.753378206" observedRunningTime="2026-04-24 21:27:17.06131916 +0000 UTC m=+49.868675761" watchObservedRunningTime="2026-04-24 21:27:17.062512407 +0000 UTC m=+49.869869011" Apr 24 21:27:25.160551 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.160516 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw"] Apr 24 21:27:25.164883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.164867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.170980 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.170953 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:27:25.171079 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.170962 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9ff9q\"" Apr 24 21:27:25.172009 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.171989 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:27:25.172117 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.172027 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.172117 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.172047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.176801 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.176783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw"] Apr 24 21:27:25.298036 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.298006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjdb\" (UniqueName: \"kubernetes.io/projected/c33a7042-d798-4d57-b607-aa47c94e184a-kube-api-access-cdjdb\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.298164 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.298050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c33a7042-d798-4d57-b607-aa47c94e184a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.399124 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.399091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjdb\" (UniqueName: \"kubernetes.io/projected/c33a7042-d798-4d57-b607-aa47c94e184a-kube-api-access-cdjdb\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.399289 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.399135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c33a7042-d798-4d57-b607-aa47c94e184a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.401482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.401462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c33a7042-d798-4d57-b607-aa47c94e184a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.408673 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.408647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjdb\" (UniqueName: \"kubernetes.io/projected/c33a7042-d798-4d57-b607-aa47c94e184a-kube-api-access-cdjdb\") pod \"managed-serviceaccount-addon-agent-657cbf6d59-vcsdw\" (UID: \"c33a7042-d798-4d57-b607-aa47c94e184a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.484695 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.484623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" Apr 24 21:27:25.611201 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.611166 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw"] Apr 24 21:27:25.615393 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:27:25.615366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc33a7042_d798_4d57_b607_aa47c94e184a.slice/crio-e2d58bf407b45ddbe9d86d0adc5bd218817a71a5de2d641411b9e7e89d3bd8a7 WatchSource:0}: Error finding container e2d58bf407b45ddbe9d86d0adc5bd218817a71a5de2d641411b9e7e89d3bd8a7: Status 404 returned error can't find the container with id e2d58bf407b45ddbe9d86d0adc5bd218817a71a5de2d641411b9e7e89d3bd8a7 Apr 24 21:27:25.994632 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:25.994601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9dbf" Apr 24 21:27:26.052253 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:26.052222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" event={"ID":"c33a7042-d798-4d57-b607-aa47c94e184a","Type":"ContainerStarted","Data":"e2d58bf407b45ddbe9d86d0adc5bd218817a71a5de2d641411b9e7e89d3bd8a7"} Apr 24 21:27:29.059147 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:29.059107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" event={"ID":"c33a7042-d798-4d57-b607-aa47c94e184a","Type":"ContainerStarted","Data":"b4ae6373816343c13551f41e3939315326127e26488d8e813edcd41734c9bb93"} Apr 24 21:27:31.949162 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:31.949122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:27:31.949162 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:31.949167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:27:31.949669 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:31.949280 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:31.949669 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:31.949287 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:31.949669 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:31.949385 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:03.949363218 +0000 UTC m=+96.756719806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:27:31.949669 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:31.949408 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:03.949397416 +0000 UTC m=+96.756754003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:27:33.459883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.459817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:27:33.462347 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.462330 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:33.470266 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:27:33.470243 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:27:33.470351 ip-10-0-137-28 
kubenswrapper[2574]: E0424 21:27:33.470342 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:37.470327217 +0000 UTC m=+130.277683799 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : secret "metrics-daemon-secret" not found Apr 24 21:27:33.561173 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.561141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:33.563928 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.563911 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:33.574126 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.574107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:33.585493 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.585468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9fp\" (UniqueName: \"kubernetes.io/projected/a6bcdb22-0356-4540-8553-9a968d14ba41-kube-api-access-nn9fp\") pod \"network-check-target-lqj24\" (UID: \"a6bcdb22-0356-4540-8553-9a968d14ba41\") " pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:33.607548 ip-10-0-137-28 kubenswrapper[2574]: 
I0424 21:27:33.607526 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9whs\"" Apr 24 21:27:33.615352 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.615337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:33.722157 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.722069 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" podStartSLOduration=6.248833677 podStartE2EDuration="8.722053285s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:25.617242435 +0000 UTC m=+58.424599019" lastFinishedPulling="2026-04-24 21:27:28.090462027 +0000 UTC m=+60.897818627" observedRunningTime="2026-04-24 21:27:29.080607416 +0000 UTC m=+61.887964021" watchObservedRunningTime="2026-04-24 21:27:33.722053285 +0000 UTC m=+66.529409911" Apr 24 21:27:33.722299 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:33.722180 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lqj24"] Apr 24 21:27:33.725287 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:27:33.725262 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6bcdb22_0356_4540_8553_9a968d14ba41.slice/crio-30ad31a610ca04292edf74cec4f1cca4078b8924d2e0c4b1f913c566e2601f99 WatchSource:0}: Error finding container 30ad31a610ca04292edf74cec4f1cca4078b8924d2e0c4b1f913c566e2601f99: Status 404 returned error can't find the container with id 30ad31a610ca04292edf74cec4f1cca4078b8924d2e0c4b1f913c566e2601f99 Apr 24 21:27:34.068026 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:34.067939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-lqj24" event={"ID":"a6bcdb22-0356-4540-8553-9a968d14ba41","Type":"ContainerStarted","Data":"30ad31a610ca04292edf74cec4f1cca4078b8924d2e0c4b1f913c566e2601f99"} Apr 24 21:27:37.075082 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:37.075042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lqj24" event={"ID":"a6bcdb22-0356-4540-8553-9a968d14ba41","Type":"ContainerStarted","Data":"cfd7c5fa684983fed04a8353a3e6aeaf2425fd039e07c0b2c136af86676cebad"} Apr 24 21:27:37.075477 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:37.075195 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:27:37.095932 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:27:37.095864 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lqj24" podStartSLOduration=67.438019443 podStartE2EDuration="1m10.095849361s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:33.727244275 +0000 UTC m=+66.534600859" lastFinishedPulling="2026-04-24 21:27:36.38507419 +0000 UTC m=+69.192430777" observedRunningTime="2026-04-24 21:27:37.095131642 +0000 UTC m=+69.902488243" watchObservedRunningTime="2026-04-24 21:27:37.095849361 +0000 UTC m=+69.903205959" Apr 24 21:28:03.978472 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:03.978430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm" Apr 24 21:28:03.978878 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:03.978481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:28:03.978878 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:03.978566 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:03.978878 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:03.978585 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:03.978878 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:03.978645 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls podName:258d3fbf-bfd8-408b-9638-e130192183f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:07.978622901 +0000 UTC m=+160.785979500 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls") pod "dns-default-w6kmm" (UID: "258d3fbf-bfd8-408b-9638-e130192183f7") : secret "dns-default-metrics-tls" not found Apr 24 21:28:03.978878 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:03.978664 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:07.97865597 +0000 UTC m=+160.786012553 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found Apr 24 21:28:08.079763 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:08.079729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lqj24" Apr 24 21:28:37.512459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:37.512420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp" Apr 24 21:28:37.512991 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:37.512554 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:28:37.512991 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:37.512623 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs podName:8ed80245-164d-4d1c-8ed3-05523db4cd57 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:39.512608245 +0000 UTC m=+252.319964831 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs") pod "network-metrics-daemon-9csmp" (UID: "8ed80245-164d-4d1c-8ed3-05523db4cd57") : secret "metrics-daemon-secret" not found Apr 24 21:28:54.071553 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.071518 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68fd45549b-hj7lt"] Apr 24 21:28:54.074324 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.074305 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:28:54.076359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.076337 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:28:54.076677 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.076656 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:28:54.076747 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.076664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gbf82\"" Apr 24 21:28:54.076936 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.076923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:28:54.077386 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.077369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:28:54.077489 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.077373 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:28:54.077546 ip-10-0-137-28 kubenswrapper[2574]: I0424 
21:28:54.077492 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:28:54.084769 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.084749 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68fd45549b-hj7lt"] Apr 24 21:28:54.184737 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.184705 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zg8z7"] Apr 24 21:28:54.187736 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.187714 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"] Apr 24 21:28:54.187893 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.187877 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" Apr 24 21:28:54.190410 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.190392 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"] Apr 24 21:28:54.190519 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.190495 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.190579 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.190549 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:28:54.190740 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.190714 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-78tv6\""
Apr 24 21:28:54.191573 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.191553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:28:54.193053 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.192331 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:28:54.193053 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.192708 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:28:54.194297 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.194275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:54.194391 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.194304 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 21:28:54.194676 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.194652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vnt6x\""
Apr 24 21:28:54.195326 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.195306 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"]
Apr 24 21:28:54.196167 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.196147 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:54.197584 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.197562 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 21:28:54.197693 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.197660 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.200280 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.200261 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"
Apr 24 21:28:54.201136 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.201116 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:28:54.202808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.202280 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:28:54.202808 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.202512 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xm9vp\""
Apr 24 21:28:54.203018 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.202994 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:28:54.203095 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203065 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:28:54.203744 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203377 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:28:54.203744 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:54.203744 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203448 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-62zzs\""
Apr 24 21:28:54.203744 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203555 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"]
Apr 24 21:28:54.203744 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.203570 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:54.204484 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.204467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zg8z7"]
Apr 24 21:28:54.205453 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.205431 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"]
Apr 24 21:28:54.219534 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.219490 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"]
Apr 24 21:28:54.231127 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.231102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.231245 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.231130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-stats-auth\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.231245 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.231148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.231245 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.231167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-default-certificate\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.231374 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.231248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwbd\" (UniqueName: \"kubernetes.io/projected/28741c17-0dca-4049-bd6e-23d87c1354b0-kube-api-access-xrwbd\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.332718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-service-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.332718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-stats-auth\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.332718 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332702 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghvg\" (UniqueName: \"kubernetes.io/projected/66268dd3-4212-4861-bf78-8f224b9e2ec4-kube-api-access-5ghvg\") pod \"volume-data-source-validator-7c6cbb6c87-7d4mv\" (UID: \"66268dd3-4212-4861-bf78-8f224b9e2ec4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae49ff-70be-49a5-864a-ffbc96166c41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2eb9d760-626d-4d98-9d3e-3f022ca09d78-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-default-certificate\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ae49ff-70be-49a5-864a-ffbc96166c41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwbd\" (UniqueName: \"kubernetes.io/projected/28741c17-0dca-4049-bd6e-23d87c1354b0-kube-api-access-xrwbd\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.332948 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:54.832928328 +0000 UTC m=+147.640284911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.332993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwrr8\" (UniqueName: \"kubernetes.io/projected/30ae49ff-70be-49a5-864a-ffbc96166c41-kube-api-access-dwrr8\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-tmp\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.333209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgp74\" (UniqueName: \"kubernetes.io/projected/2eb9d760-626d-4d98-9d3e-3f022ca09d78-kube-api-access-sgp74\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-snapshots\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8t5l\" (UniqueName: \"kubernetes.io/projected/2aa96248-3e79-4b6e-b5ab-600b84643235-kube-api-access-n8t5l\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa96248-3e79-4b6e-b5ab-600b84643235-serving-cert\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333395 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.333456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.333565 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:28:54.333631 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.333615 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:54.833598858 +0000 UTC m=+147.640955441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : secret "router-metrics-certs-default" not found
Apr 24 21:28:54.335376 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.335357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-default-certificate\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.335427 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.335373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-stats-auth\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.341735 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.341715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwbd\" (UniqueName: \"kubernetes.io/projected/28741c17-0dca-4049-bd6e-23d87c1354b0-kube-api-access-xrwbd\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.434160 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-snapshots\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434160 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8t5l\" (UniqueName: \"kubernetes.io/projected/2aa96248-3e79-4b6e-b5ab-600b84643235-kube-api-access-n8t5l\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa96248-3e79-4b6e-b5ab-600b84643235-serving-cert\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.434420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-service-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434420 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghvg\" (UniqueName: \"kubernetes.io/projected/66268dd3-4212-4861-bf78-8f224b9e2ec4-kube-api-access-5ghvg\") pod \"volume-data-source-validator-7c6cbb6c87-7d4mv\" (UID: \"66268dd3-4212-4861-bf78-8f224b9e2ec4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"
Apr 24 21:28:54.434651 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae49ff-70be-49a5-864a-ffbc96166c41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.434651 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2eb9d760-626d-4d98-9d3e-3f022ca09d78-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.434741 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ae49ff-70be-49a5-864a-ffbc96166c41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.434741 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwrr8\" (UniqueName: \"kubernetes.io/projected/30ae49ff-70be-49a5-864a-ffbc96166c41-kube-api-access-dwrr8\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.434880 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-tmp\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.434880 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.434788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgp74\" (UniqueName: \"kubernetes.io/projected/2eb9d760-626d-4d98-9d3e-3f022ca09d78-kube-api-access-sgp74\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.435082 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-snapshots\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.435158 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.434575 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:28:54.435211 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.435165 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:54.935145499 +0000 UTC m=+147.742502094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:28:54.435272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae49ff-70be-49a5-864a-ffbc96166c41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.435272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-service-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.435553 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa96248-3e79-4b6e-b5ab-600b84643235-tmp\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.435661 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2eb9d760-626d-4d98-9d3e-3f022ca09d78-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.435894 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.435877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa96248-3e79-4b6e-b5ab-600b84643235-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.436899 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.436876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa96248-3e79-4b6e-b5ab-600b84643235-serving-cert\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.437307 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.437281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ae49ff-70be-49a5-864a-ffbc96166c41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.444561 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.444532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8t5l\" (UniqueName: \"kubernetes.io/projected/2aa96248-3e79-4b6e-b5ab-600b84643235-kube-api-access-n8t5l\") pod \"insights-operator-585dfdc468-zg8z7\" (UID: \"2aa96248-3e79-4b6e-b5ab-600b84643235\") " pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.444674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.444609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgp74\" (UniqueName: \"kubernetes.io/projected/2eb9d760-626d-4d98-9d3e-3f022ca09d78-kube-api-access-sgp74\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.444674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.444640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwrr8\" (UniqueName: \"kubernetes.io/projected/30ae49ff-70be-49a5-864a-ffbc96166c41-kube-api-access-dwrr8\") pod \"kube-storage-version-migrator-operator-6769c5d45-nmnkl\" (UID: \"30ae49ff-70be-49a5-864a-ffbc96166c41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.444674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.444647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghvg\" (UniqueName: \"kubernetes.io/projected/66268dd3-4212-4861-bf78-8f224b9e2ec4-kube-api-access-5ghvg\") pod \"volume-data-source-validator-7c6cbb6c87-7d4mv\" (UID: \"66268dd3-4212-4861-bf78-8f224b9e2ec4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"
Apr 24 21:28:54.498897 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.498861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zg8z7"
Apr 24 21:28:54.507645 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.507623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"
Apr 24 21:28:54.518452 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.518426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"
Apr 24 21:28:54.630543 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.630450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zg8z7"]
Apr 24 21:28:54.633236 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:28:54.633204 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa96248_3e79_4b6e_b5ab_600b84643235.slice/crio-0f189c0c9f0afed5c816b9020e7da46876232272a758b29d7a92a318157a89d4 WatchSource:0}: Error finding container 0f189c0c9f0afed5c816b9020e7da46876232272a758b29d7a92a318157a89d4: Status 404 returned error can't find the container with id 0f189c0c9f0afed5c816b9020e7da46876232272a758b29d7a92a318157a89d4
Apr 24 21:28:54.646393 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.646372 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl"]
Apr 24 21:28:54.649219 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:28:54.649195 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ae49ff_70be_49a5_864a_ffbc96166c41.slice/crio-4541a2ea89f4ac3ebe0053c391f4ac0d788805b643240a43898a83a5735edc71 WatchSource:0}: Error finding container 4541a2ea89f4ac3ebe0053c391f4ac0d788805b643240a43898a83a5735edc71: Status 404 returned error can't find the container with id 4541a2ea89f4ac3ebe0053c391f4ac0d788805b643240a43898a83a5735edc71
Apr 24 21:28:54.659917 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.659898 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv"]
Apr 24 21:28:54.662557 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:28:54.662535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66268dd3_4212_4861_bf78_8f224b9e2ec4.slice/crio-6e4d9237e17b8f120833e3995dbcf987e9e3c7a2faeda220f5c7b16c9e6dc0c4 WatchSource:0}: Error finding container 6e4d9237e17b8f120833e3995dbcf987e9e3c7a2faeda220f5c7b16c9e6dc0c4: Status 404 returned error can't find the container with id 6e4d9237e17b8f120833e3995dbcf987e9e3c7a2faeda220f5c7b16c9e6dc0c4
Apr 24 21:28:54.839104 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.839007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.839257 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.839179 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:55.839160994 +0000 UTC m=+148.646517581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt
Apr 24 21:28:54.839310 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.839259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:28:54.839352 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.839336 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:28:54.839390 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.839366 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:55.839358216 +0000 UTC m=+148.646714799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : secret "router-metrics-certs-default" not found
Apr 24 21:28:54.939935 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:54.939900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:28:54.940112 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.940075 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:28:54.940186 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:54.940157 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:55.940134645 +0000 UTC m=+148.747491230 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:55.226568 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.226526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv" event={"ID":"66268dd3-4212-4861-bf78-8f224b9e2ec4","Type":"ContainerStarted","Data":"6e4d9237e17b8f120833e3995dbcf987e9e3c7a2faeda220f5c7b16c9e6dc0c4"} Apr 24 21:28:55.227781 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.227610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl" event={"ID":"30ae49ff-70be-49a5-864a-ffbc96166c41","Type":"ContainerStarted","Data":"4541a2ea89f4ac3ebe0053c391f4ac0d788805b643240a43898a83a5735edc71"} Apr 24 21:28:55.229096 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.229069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" event={"ID":"2aa96248-3e79-4b6e-b5ab-600b84643235","Type":"ContainerStarted","Data":"0f189c0c9f0afed5c816b9020e7da46876232272a758b29d7a92a318157a89d4"} Apr 24 21:28:55.847185 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.847145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:28:55.847383 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.847212 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:28:55.847669 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:55.847635 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:55.847787 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:55.847743 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:57.847707158 +0000 UTC m=+150.655063761 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : secret "router-metrics-certs-default" not found Apr 24 21:28:55.848129 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:55.848094 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:57.848072241 +0000 UTC m=+150.655428826 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:55.949243 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:55.949172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" Apr 24 21:28:55.949446 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:55.949340 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:55.949446 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:55.949415 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:57.949398664 +0000 UTC m=+150.756755246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:57.235679 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.235575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv" event={"ID":"66268dd3-4212-4861-bf78-8f224b9e2ec4","Type":"ContainerStarted","Data":"24a82779bbdd9d808d6341deeeafe57cfe3a1b8a7ec4518f47008f735e9dfea0"} Apr 24 21:28:57.236773 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.236741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl" event={"ID":"30ae49ff-70be-49a5-864a-ffbc96166c41","Type":"ContainerStarted","Data":"1969d222b321cf67114542f9c9cd48882fa5194ded4abebff425c2f5028757d7"} Apr 24 21:28:57.238061 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.238036 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" event={"ID":"2aa96248-3e79-4b6e-b5ab-600b84643235","Type":"ContainerStarted","Data":"52797d3eb35c1f025263b16cd6fc2007ca45886fbdffb061d693f744f8257e1d"} Apr 24 21:28:57.254255 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.254199 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7d4mv" podStartSLOduration=0.947300254 podStartE2EDuration="3.25418232s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.664205878 +0000 UTC m=+147.471562462" lastFinishedPulling="2026-04-24 21:28:56.971087941 +0000 UTC m=+149.778444528" 
observedRunningTime="2026-04-24 21:28:57.252906862 +0000 UTC m=+150.060263490" watchObservedRunningTime="2026-04-24 21:28:57.25418232 +0000 UTC m=+150.061538922" Apr 24 21:28:57.273129 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.273076 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" podStartSLOduration=0.934745223 podStartE2EDuration="3.273059874s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.635141263 +0000 UTC m=+147.442497849" lastFinishedPulling="2026-04-24 21:28:56.973455911 +0000 UTC m=+149.780812500" observedRunningTime="2026-04-24 21:28:57.271754373 +0000 UTC m=+150.079110981" watchObservedRunningTime="2026-04-24 21:28:57.273059874 +0000 UTC m=+150.080416481" Apr 24 21:28:57.288416 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.288363 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl" podStartSLOduration=0.961403698 podStartE2EDuration="3.288346899s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.651039961 +0000 UTC m=+147.458396544" lastFinishedPulling="2026-04-24 21:28:56.977983151 +0000 UTC m=+149.785339745" observedRunningTime="2026-04-24 21:28:57.287998716 +0000 UTC m=+150.095355321" watchObservedRunningTime="2026-04-24 21:28:57.288346899 +0000 UTC m=+150.095703505" Apr 24 21:28:57.393390 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.393350 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"] Apr 24 21:28:57.396882 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.396859 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.400630 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.400597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:28:57.400791 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.400601 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:28:57.400791 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.400714 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:28:57.400791 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.400741 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvff9\"" Apr 24 21:28:57.406199 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.406178 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:28:57.412944 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.412922 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"] Apr 24 21:28:57.562470 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562470 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562429 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562608 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94m6v\" (UniqueName: 
\"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.562961 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.562704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.663903 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.663867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.664095 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.663922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.664481 
ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.664445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.664583 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.664531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.665307 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.665278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.665760 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.665738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.666059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.666040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted\") pod 
\"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668366 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.668293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668472 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.668386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668529 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.668497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668576 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.668540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94m6v\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668623 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:28:57.668578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.668757 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.668737 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:57.668802 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.668763 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86f6c45576-mlt7l: secret "image-registry-tls" not found Apr 24 21:28:57.668882 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.668864 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls podName:61e99d32-d2fc-4fc0-bfbc-020aaca95305 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:58.168816407 +0000 UTC m=+150.976173000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls") pod "image-registry-86f6c45576-mlt7l" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305") : secret "image-registry-tls" not found Apr 24 21:28:57.672178 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.672150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.673058 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.673029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.679526 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.679502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94m6v\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:57.870260 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.870167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " 
pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:28:57.870408 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.870259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:28:57.870408 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.870335 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:57.870408 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.870395 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.870379038 +0000 UTC m=+154.677735637 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:57.870507 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.870413 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.870406528 +0000 UTC m=+154.677763111 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : secret "router-metrics-certs-default" not found Apr 24 21:28:57.970933 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:57.970893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" Apr 24 21:28:57.971104 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.971049 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:57.971147 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:57.971113 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.971096396 +0000 UTC m=+154.778452979 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:58.173583 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:58.173545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:58.173748 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:58.173690 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:58.173748 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:58.173709 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86f6c45576-mlt7l: secret "image-registry-tls" not found Apr 24 21:28:58.173821 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:58.173764 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls podName:61e99d32-d2fc-4fc0-bfbc-020aaca95305 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:59.173747144 +0000 UTC m=+151.981103726 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls") pod "image-registry-86f6c45576-mlt7l" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305") : secret "image-registry-tls" not found Apr 24 21:28:59.181969 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:59.181908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:28:59.182365 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:59.182066 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:59.182365 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:59.182085 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86f6c45576-mlt7l: secret "image-registry-tls" not found Apr 24 21:28:59.182365 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:28:59.182144 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls podName:61e99d32-d2fc-4fc0-bfbc-020aaca95305 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.182127389 +0000 UTC m=+153.989483992 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls") pod "image-registry-86f6c45576-mlt7l" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305") : secret "image-registry-tls" not found Apr 24 21:28:59.861248 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:28:59.861220 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5k6kg_00622ba2-e987-487b-870b-1558450fa114/dns-node-resolver/0.log" Apr 24 21:29:00.991750 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:00.991714 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz69t"] Apr 24 21:29:00.995968 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:00.995944 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pz69t" Apr 24 21:29:00.999691 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:00.999667 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:29:00.999801 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:00.999694 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:29:01.000652 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.000633 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:29:01.000726 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.000652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:29:01.000726 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.000666 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mw25l\"" Apr 24 21:29:01.007876 
ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.007855 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz69t"]
Apr 24 21:29:01.063946 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.063916 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l5t5z_8b42bf05-9792-4dd3-9486-e262d6b7afc8/node-ca/0.log"
Apr 24 21:29:01.097435 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.097400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-cabundle\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.097603 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.097458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77nr2\" (UniqueName: \"kubernetes.io/projected/d2682ddb-6364-4fde-9a7e-c4574126ab0d-kube-api-access-77nr2\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.097603 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.097557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-key\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.198108 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.198063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-cabundle\")
pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.198271 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.198217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77nr2\" (UniqueName: \"kubernetes.io/projected/d2682ddb-6364-4fde-9a7e-c4574126ab0d-kube-api-access-77nr2\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.198271 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.198252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:01.198356 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.198282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-key\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.198405 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.198368 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:01.198405 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.198385 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86f6c45576-mlt7l: secret "image-registry-tls" not found
Apr 24 21:29:01.198500 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.198434 2574 nestedpendingoperations.go:348] Operation
for "{volumeName:kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls podName:61e99d32-d2fc-4fc0-bfbc-020aaca95305 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:05.198421011 +0000 UTC m=+158.005777593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls") pod "image-registry-86f6c45576-mlt7l" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305") : secret "image-registry-tls" not found
Apr 24 21:29:01.198703 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.198685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-cabundle\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.200622 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.200598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d2682ddb-6364-4fde-9a7e-c4574126ab0d-signing-key\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.206199 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.206176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77nr2\" (UniqueName: \"kubernetes.io/projected/d2682ddb-6364-4fde-9a7e-c4574126ab0d-kube-api-access-77nr2\") pod \"service-ca-865cb79987-pz69t\" (UID: \"d2682ddb-6364-4fde-9a7e-c4574126ab0d\") " pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.304590 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.304522 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pz69t"
Apr 24 21:29:01.414595 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.414564 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz69t"]
Apr 24 21:29:01.418151 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:01.418123 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2682ddb_6364_4fde_9a7e_c4574126ab0d.slice/crio-3edc2f26e2613475a29c3028f947011b826a4710f73eaebb7bbf5ea4e22b88f7 WatchSource:0}: Error finding container 3edc2f26e2613475a29c3028f947011b826a4710f73eaebb7bbf5ea4e22b88f7: Status 404 returned error can't find the container with id 3edc2f26e2613475a29c3028f947011b826a4710f73eaebb7bbf5ea4e22b88f7
Apr 24 21:29:01.904707 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.904670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:29:01.904915 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:01.904773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:29:01.904915 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.904869 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed.
No retries permitted until 2026-04-24 21:29:09.904847665 +0000 UTC m=+162.712204249 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt
Apr 24 21:29:01.904915 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.904910 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:29:01.905090 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:01.904960 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.904947916 +0000 UTC m=+162.712304499 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : secret "router-metrics-certs-default" not found
Apr 24 21:29:02.005573 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:02.005533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:29:02.006018 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:02.005678 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:02.006018 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:02.005757 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:10.005736286 +0000 UTC m=+162.813092882 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:02.253454 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:02.253353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pz69t" event={"ID":"d2682ddb-6364-4fde-9a7e-c4574126ab0d","Type":"ContainerStarted","Data":"3edc2f26e2613475a29c3028f947011b826a4710f73eaebb7bbf5ea4e22b88f7"}
Apr 24 21:29:03.154029 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:03.153991 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-w6kmm" podUID="258d3fbf-bfd8-408b-9638-e130192183f7"
Apr 24 21:29:03.158123 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:03.158092 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p9djn" podUID="07d12594-0cd4-4f7e-8c3a-c529a1051347"
Apr 24 21:29:03.256930 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:03.256902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pz69t" event={"ID":"d2682ddb-6364-4fde-9a7e-c4574126ab0d","Type":"ContainerStarted","Data":"31f23e98198fe4071d71eb53ae0bef9218020979489a637fd485b034bc463a66"}
Apr 24 21:29:03.257071 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:03.256938 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9djn"
Apr 24 21:29:03.257071 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:03.256969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:03.274503 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:03.274457 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pz69t" podStartSLOduration=1.507261557 podStartE2EDuration="3.274445405s" podCreationTimestamp="2026-04-24 21:29:00 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.420141023 +0000 UTC m=+154.227497606" lastFinishedPulling="2026-04-24 21:29:03.187324871 +0000 UTC m=+155.994681454" observedRunningTime="2026-04-24 21:29:03.273579997 +0000 UTC m=+156.080936612" watchObservedRunningTime="2026-04-24 21:29:03.274445405 +0000 UTC m=+156.081802009"
Apr 24 21:29:04.795394 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:04.795342 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9csmp" podUID="8ed80245-164d-4d1c-8ed3-05523db4cd57"
Apr 24 21:29:05.231538 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:05.231499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:05.231745 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:05.231654 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:05.231745 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:05.231678 2574
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86f6c45576-mlt7l: secret "image-registry-tls" not found
Apr 24 21:29:05.231906 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:05.231748 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls podName:61e99d32-d2fc-4fc0-bfbc-020aaca95305 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:13.231727576 +0000 UTC m=+166.039084159 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls") pod "image-registry-86f6c45576-mlt7l" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305") : secret "image-registry-tls" not found
Apr 24 21:29:08.052139 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.052101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:08.052139 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.052137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn"
Apr 24 21:29:08.052704 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:08.052232 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:08.052704 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:08.052274 2574 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert podName:07d12594-0cd4-4f7e-8c3a-c529a1051347 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:10.052261635 +0000 UTC m=+282.859618218 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert") pod "ingress-canary-p9djn" (UID: "07d12594-0cd4-4f7e-8c3a-c529a1051347") : secret "canary-serving-cert" not found
Apr 24 21:29:08.054446 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.054417 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/258d3fbf-bfd8-408b-9638-e130192183f7-metrics-tls\") pod \"dns-default-w6kmm\" (UID: \"258d3fbf-bfd8-408b-9638-e130192183f7\") " pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:08.059922 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.059899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\""
Apr 24 21:29:08.067484 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.067460 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:08.191626 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.191591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w6kmm"]
Apr 24 21:29:08.195333 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:08.195301 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258d3fbf_bfd8_408b_9638_e130192183f7.slice/crio-6bec484eb40f8e45c145cdcec7855fa252af379988cc53641d65589e99fcb5cc WatchSource:0}: Error finding container 6bec484eb40f8e45c145cdcec7855fa252af379988cc53641d65589e99fcb5cc: Status 404 returned error can't find the container with id 6bec484eb40f8e45c145cdcec7855fa252af379988cc53641d65589e99fcb5cc
Apr 24 21:29:08.274780 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:08.274743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6kmm" event={"ID":"258d3fbf-bfd8-408b-9638-e130192183f7","Type":"ContainerStarted","Data":"6bec484eb40f8e45c145cdcec7855fa252af379988cc53641d65589e99fcb5cc"}
Apr 24 21:29:09.968559 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:09.968501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:29:09.968952 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:09.968619 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:29:09.968952
ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:09.968788 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle podName:28741c17-0dca-4049-bd6e-23d87c1354b0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:25.968765877 +0000 UTC m=+178.776122484 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle") pod "router-default-68fd45549b-hj7lt" (UID: "28741c17-0dca-4049-bd6e-23d87c1354b0") : configmap references non-existent config key: service-ca.crt
Apr 24 21:29:09.970894 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:09.970871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28741c17-0dca-4049-bd6e-23d87c1354b0-metrics-certs\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt"
Apr 24 21:29:10.069297 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:10.069261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"
Apr 24 21:29:10.069481 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:10.069408 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:10.069481 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:10.069468 2574 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls podName:2eb9d760-626d-4d98-9d3e-3f022ca09d78 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:26.069451138 +0000 UTC m=+178.876807730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2rpxz" (UID: "2eb9d760-626d-4d98-9d3e-3f022ca09d78") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:10.281168 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:10.281089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6kmm" event={"ID":"258d3fbf-bfd8-408b-9638-e130192183f7","Type":"ContainerStarted","Data":"22ae0ded0ec3e03e5213d9c6ed8a2ad7da2620ea3fdf473322d5bccc122a5413"}
Apr 24 21:29:10.281168 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:10.281125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6kmm" event={"ID":"258d3fbf-bfd8-408b-9638-e130192183f7","Type":"ContainerStarted","Data":"6268f9d6fdebe93b13c82cd471997ec60f944ff525021d79dc0aea7dac7584ab"}
Apr 24 21:29:10.281336 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:10.281218 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:10.302123 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:10.302071 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w6kmm" podStartSLOduration=128.99066055 podStartE2EDuration="2m10.302054441s" podCreationTimestamp="2026-04-24 21:27:00 +0000 UTC" firstStartedPulling="2026-04-24 21:29:08.197298235 +0000 UTC m=+161.004654818" lastFinishedPulling="2026-04-24 21:29:09.508692125 +0000 UTC m=+162.316048709" observedRunningTime="2026-04-24 21:29:10.300659504 +0000 UTC
m=+163.108016109" watchObservedRunningTime="2026-04-24 21:29:10.302054441 +0000 UTC m=+163.109411041"
Apr 24 21:29:13.294963 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:13.294929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:13.297272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:13.297246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"image-registry-86f6c45576-mlt7l\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") " pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:13.308205 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:13.308176 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:13.431411 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:13.431375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"]
Apr 24 21:29:13.434732 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:13.434695 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e99d32_d2fc_4fc0_bfbc_020aaca95305.slice/crio-232e554e7a592a72c00e42de066fe6658270ca2bb9d6623ba698a18447cb7c37 WatchSource:0}: Error finding container 232e554e7a592a72c00e42de066fe6658270ca2bb9d6623ba698a18447cb7c37: Status 404 returned error can't find the container with id 232e554e7a592a72c00e42de066fe6658270ca2bb9d6623ba698a18447cb7c37
Apr 24 21:29:14.293535 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:14.293503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" event={"ID":"61e99d32-d2fc-4fc0-bfbc-020aaca95305","Type":"ContainerStarted","Data":"ec5bd826d57dd542705e506a460047fa99943482d7d1ba9bdc37982ef69b4728"}
Apr 24 21:29:14.293535 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:14.293537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" event={"ID":"61e99d32-d2fc-4fc0-bfbc-020aaca95305","Type":"ContainerStarted","Data":"232e554e7a592a72c00e42de066fe6658270ca2bb9d6623ba698a18447cb7c37"}
Apr 24 21:29:14.293733 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:14.293645 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:14.316346 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:14.316305 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
podStartSLOduration=17.316284577 podStartE2EDuration="17.316284577s" podCreationTimestamp="2026-04-24 21:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:14.315159067 +0000 UTC m=+167.122515671" watchObservedRunningTime="2026-04-24 21:29:14.316284577 +0000 UTC m=+167.123641182"
Apr 24 21:29:17.785534 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:17.785493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:29:20.285729 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:20.285701 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w6kmm"
Apr 24 21:29:21.832994 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.832956 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"]
Apr 24 21:29:21.930636 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.930603 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hbtb6"]
Apr 24 21:29:21.937648 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.937624 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:21.942448 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.942426 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:29:21.942599 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.942460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:29:21.942599 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.942542 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdl4k\""
Apr 24 21:29:21.954257 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.954227 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hbtb6"]
Apr 24 21:29:21.962266 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.962236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prs5\" (UniqueName: \"kubernetes.io/projected/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-api-access-8prs5\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:21.962418 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.962274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3916a983-0b17-4fc6-aaf4-6f315f378c9f-crio-socket\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:21.962418 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.962398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:21.962498 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.962445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3916a983-0b17-4fc6-aaf4-6f315f378c9f-data-volume\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:21.962498 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:21.962485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3916a983-0b17-4fc6-aaf4-6f315f378c9f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.063804 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3916a983-0b17-4fc6-aaf4-6f315f378c9f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.063804 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8prs5\" (UniqueName: \"kubernetes.io/projected/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-api-access-8prs5\") pod \"insights-runtime-extractor-hbtb6\" (UID:
\"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.063804 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3916a983-0b17-4fc6-aaf4-6f315f378c9f-crio-socket\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.064130 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.064130 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3916a983-0b17-4fc6-aaf4-6f315f378c9f-data-volume\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.064130 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.063980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3916a983-0b17-4fc6-aaf4-6f315f378c9f-crio-socket\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6"
Apr 24 21:29:22.064283 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.064222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName:
\"kubernetes.io/empty-dir/3916a983-0b17-4fc6-aaf4-6f315f378c9f-data-volume\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6" Apr 24 21:29:22.064560 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.064536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6" Apr 24 21:29:22.066306 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.066284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3916a983-0b17-4fc6-aaf4-6f315f378c9f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6" Apr 24 21:29:22.072865 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.072821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prs5\" (UniqueName: \"kubernetes.io/projected/3916a983-0b17-4fc6-aaf4-6f315f378c9f-kube-api-access-8prs5\") pod \"insights-runtime-extractor-hbtb6\" (UID: \"3916a983-0b17-4fc6-aaf4-6f315f378c9f\") " pod="openshift-insights/insights-runtime-extractor-hbtb6" Apr 24 21:29:22.246527 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.246497 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hbtb6" Apr 24 21:29:22.397901 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:22.397880 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hbtb6"] Apr 24 21:29:22.399917 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:22.399886 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3916a983_0b17_4fc6_aaf4_6f315f378c9f.slice/crio-4f8e89d3935208d27082d6f7fa85423db123b408240fdde4c1b592f03e63cbba WatchSource:0}: Error finding container 4f8e89d3935208d27082d6f7fa85423db123b408240fdde4c1b592f03e63cbba: Status 404 returned error can't find the container with id 4f8e89d3935208d27082d6f7fa85423db123b408240fdde4c1b592f03e63cbba Apr 24 21:29:23.318640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:23.318554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hbtb6" event={"ID":"3916a983-0b17-4fc6-aaf4-6f315f378c9f","Type":"ContainerStarted","Data":"8dc0e265daa27451544d20001d19a649562b9c9c71a5628aad1350deb1c88332"} Apr 24 21:29:23.318640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:23.318587 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hbtb6" event={"ID":"3916a983-0b17-4fc6-aaf4-6f315f378c9f","Type":"ContainerStarted","Data":"f560ac1f2ca4362dd7ce870ffef8e828d0d6635aeaad1f0744134ec378f5b1f2"} Apr 24 21:29:23.318640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:23.318597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hbtb6" event={"ID":"3916a983-0b17-4fc6-aaf4-6f315f378c9f","Type":"ContainerStarted","Data":"4f8e89d3935208d27082d6f7fa85423db123b408240fdde4c1b592f03e63cbba"} Apr 24 21:29:25.325661 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:25.325623 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-hbtb6" event={"ID":"3916a983-0b17-4fc6-aaf4-6f315f378c9f","Type":"ContainerStarted","Data":"dfae766b595eafbc53078a29f25343ca2d918941840395445b440a1b6edaefe7"} Apr 24 21:29:25.346648 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:25.346604 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hbtb6" podStartSLOduration=2.373035318 podStartE2EDuration="4.346590417s" podCreationTimestamp="2026-04-24 21:29:21 +0000 UTC" firstStartedPulling="2026-04-24 21:29:22.464150606 +0000 UTC m=+175.271507202" lastFinishedPulling="2026-04-24 21:29:24.437705717 +0000 UTC m=+177.245062301" observedRunningTime="2026-04-24 21:29:25.345733462 +0000 UTC m=+178.153090079" watchObservedRunningTime="2026-04-24 21:29:25.346590417 +0000 UTC m=+178.153947022" Apr 24 21:29:25.992280 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:25.992246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:25.993722 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:25.993701 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28741c17-0dca-4049-bd6e-23d87c1354b0-service-ca-bundle\") pod \"router-default-68fd45549b-hj7lt\" (UID: \"28741c17-0dca-4049-bd6e-23d87c1354b0\") " pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:26.092977 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.092932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" Apr 24 21:29:26.095338 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.095317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eb9d760-626d-4d98-9d3e-3f022ca09d78-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2rpxz\" (UID: \"2eb9d760-626d-4d98-9d3e-3f022ca09d78\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" Apr 24 21:29:26.183650 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.183619 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:26.297293 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.297263 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68fd45549b-hj7lt"] Apr 24 21:29:26.301051 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:26.301013 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28741c17_0dca_4049_bd6e_23d87c1354b0.slice/crio-568884760a92d3aae28929e59543fa65737994219cef80a1e2a91bd8b9e304bf WatchSource:0}: Error finding container 568884760a92d3aae28929e59543fa65737994219cef80a1e2a91bd8b9e304bf: Status 404 returned error can't find the container with id 568884760a92d3aae28929e59543fa65737994219cef80a1e2a91bd8b9e304bf Apr 24 21:29:26.313300 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.313278 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" Apr 24 21:29:26.328692 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.328664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68fd45549b-hj7lt" event={"ID":"28741c17-0dca-4049-bd6e-23d87c1354b0","Type":"ContainerStarted","Data":"568884760a92d3aae28929e59543fa65737994219cef80a1e2a91bd8b9e304bf"} Apr 24 21:29:26.433142 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:26.433112 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz"] Apr 24 21:29:26.436617 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:26.436590 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb9d760_626d_4d98_9d3e_3f022ca09d78.slice/crio-09c9714c335bdb8252c9e6fe6524fe3470b79abb8411a77eafbca5eaa4ae3244 WatchSource:0}: Error finding container 09c9714c335bdb8252c9e6fe6524fe3470b79abb8411a77eafbca5eaa4ae3244: Status 404 returned error can't find the container with id 09c9714c335bdb8252c9e6fe6524fe3470b79abb8411a77eafbca5eaa4ae3244 Apr 24 21:29:27.333247 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:27.333209 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68fd45549b-hj7lt" event={"ID":"28741c17-0dca-4049-bd6e-23d87c1354b0","Type":"ContainerStarted","Data":"1c04ce4a4d642c0371806ec0e530d28b47d9fb8eda799999fe284231c8e5a9f2"} Apr 24 21:29:27.334329 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:27.334303 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" event={"ID":"2eb9d760-626d-4d98-9d3e-3f022ca09d78","Type":"ContainerStarted","Data":"09c9714c335bdb8252c9e6fe6524fe3470b79abb8411a77eafbca5eaa4ae3244"} Apr 24 21:29:27.358426 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:27.358382 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68fd45549b-hj7lt" podStartSLOduration=33.358369559 podStartE2EDuration="33.358369559s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:27.358139683 +0000 UTC m=+180.165496289" watchObservedRunningTime="2026-04-24 21:29:27.358369559 +0000 UTC m=+180.165726174" Apr 24 21:29:28.184737 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.184706 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:28.187404 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.187384 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:28.337449 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.337422 2574 generic.go:358] "Generic (PLEG): container finished" podID="c33a7042-d798-4d57-b607-aa47c94e184a" containerID="b4ae6373816343c13551f41e3939315326127e26488d8e813edcd41734c9bb93" exitCode=255 Apr 24 21:29:28.337732 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.337489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" event={"ID":"c33a7042-d798-4d57-b607-aa47c94e184a","Type":"ContainerDied","Data":"b4ae6373816343c13551f41e3939315326127e26488d8e813edcd41734c9bb93"} Apr 24 21:29:28.337775 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.337739 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:28.339119 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.339098 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-68fd45549b-hj7lt" Apr 24 21:29:28.343588 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:28.343571 2574 scope.go:117] "RemoveContainer" containerID="b4ae6373816343c13551f41e3939315326127e26488d8e813edcd41734c9bb93" Apr 24 21:29:29.342034 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:29.341989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-657cbf6d59-vcsdw" event={"ID":"c33a7042-d798-4d57-b607-aa47c94e184a","Type":"ContainerStarted","Data":"67a7e04517f2205cbf6cd74c5055fb2d208db401d951c4e28fb0513f29a93182"} Apr 24 21:29:29.343275 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:29.343248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" event={"ID":"2eb9d760-626d-4d98-9d3e-3f022ca09d78","Type":"ContainerStarted","Data":"37c2249416e83388418701db88c0d42f918240863f37a6185f70b0de42b2ea46"} Apr 24 21:29:29.377389 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:29.377344 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2rpxz" podStartSLOduration=33.486008637 podStartE2EDuration="35.377331194s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:29:26.438454593 +0000 UTC m=+179.245811180" lastFinishedPulling="2026-04-24 21:29:28.329777139 +0000 UTC m=+181.137133737" observedRunningTime="2026-04-24 21:29:29.376991019 +0000 UTC m=+182.184347615" watchObservedRunningTime="2026-04-24 21:29:29.377331194 +0000 UTC m=+182.184687798" Apr 24 21:29:31.838658 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:31.838633 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" Apr 24 21:29:32.908210 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.908172 2574 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmqcd"] Apr 24 21:29:32.911544 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.911525 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:32.914040 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.914015 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:29:32.914040 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.914015 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:29:32.914224 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.914026 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-r7zr4\"" Apr 24 21:29:32.915009 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.914991 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:29:32.921248 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:32.921226 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmqcd"] Apr 24 21:29:33.045664 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.045622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4425w\" (UniqueName: \"kubernetes.io/projected/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-kube-api-access-4425w\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.045840 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.045712 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.045840 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.045785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.045916 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.045842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.146269 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.146233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.146631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.146292 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.146631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.146321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.146631 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.146389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4425w\" (UniqueName: \"kubernetes.io/projected/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-kube-api-access-4425w\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.146631 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:33.146454 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 21:29:33.146631 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:33.146539 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls podName:7ce31d54-5e72-4ae6-b9c8-32c890858e6d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:33.64651707 +0000 UTC m=+186.453873654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-rmqcd" (UID: "7ce31d54-5e72-4ae6-b9c8-32c890858e6d") : secret "prometheus-operator-tls" not found Apr 24 21:29:33.147158 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.147128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.148786 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.148765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.155638 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.155615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4425w\" (UniqueName: \"kubernetes.io/projected/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-kube-api-access-4425w\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.651849 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.651796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: 
\"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.654137 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.654106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ce31d54-5e72-4ae6-b9c8-32c890858e6d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rmqcd\" (UID: \"7ce31d54-5e72-4ae6-b9c8-32c890858e6d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.820756 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.820718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" Apr 24 21:29:33.935234 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:33.935167 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rmqcd"] Apr 24 21:29:33.937845 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:33.937802 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce31d54_5e72_4ae6_b9c8_32c890858e6d.slice/crio-13fde221efc5d6dbb1b59baf0572783bf727c85f23230c4131724c5f4f123c8c WatchSource:0}: Error finding container 13fde221efc5d6dbb1b59baf0572783bf727c85f23230c4131724c5f4f123c8c: Status 404 returned error can't find the container with id 13fde221efc5d6dbb1b59baf0572783bf727c85f23230c4131724c5f4f123c8c Apr 24 21:29:34.358031 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:34.357932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" event={"ID":"7ce31d54-5e72-4ae6-b9c8-32c890858e6d","Type":"ContainerStarted","Data":"13fde221efc5d6dbb1b59baf0572783bf727c85f23230c4131724c5f4f123c8c"} Apr 24 21:29:35.363504 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:35.363464 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" event={"ID":"7ce31d54-5e72-4ae6-b9c8-32c890858e6d","Type":"ContainerStarted","Data":"d30876d22510f01854d487d520887f445e927b4e206934c7bd52620e816f98f2"} Apr 24 21:29:35.363504 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:35.363508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" event={"ID":"7ce31d54-5e72-4ae6-b9c8-32c890858e6d","Type":"ContainerStarted","Data":"62b93d27694e6ee27b0672deb7d6a784e27ed39301d88082fa14eeaa4a5519e0"} Apr 24 21:29:35.380503 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:35.380451 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rmqcd" podStartSLOduration=2.156482299 podStartE2EDuration="3.380436259s" podCreationTimestamp="2026-04-24 21:29:32 +0000 UTC" firstStartedPulling="2026-04-24 21:29:33.939683687 +0000 UTC m=+186.747040270" lastFinishedPulling="2026-04-24 21:29:35.163637643 +0000 UTC m=+187.970994230" observedRunningTime="2026-04-24 21:29:35.380084285 +0000 UTC m=+188.187440890" watchObservedRunningTime="2026-04-24 21:29:35.380436259 +0000 UTC m=+188.187792863" Apr 24 21:29:37.298331 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.298300 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hzb5l"] Apr 24 21:29:37.301592 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.301562 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.304122 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.304101 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:29:37.304259 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.304241 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:29:37.304331 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.304195 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:29:37.304384 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.304204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k4448\"" Apr 24 21:29:37.383151 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-node-exporter-accelerators-collector-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383324 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fpvm\" (UniqueName: \"kubernetes.io/projected/89365de9-d284-465e-a69f-20a06b192232-kube-api-access-9fpvm\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383324 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383302 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-sys\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89365de9-d284-465e-a69f-20a06b192232-node-exporter-textfile\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-metrics-client-ca\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383395 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-node-exporter-wtmp\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " 
pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383648 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.383648 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.383564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-root\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.484467 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.484428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-root\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.484650 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.484490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-node-exporter-accelerators-collector-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.484650 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.484523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fpvm\" (UniqueName: \"kubernetes.io/projected/89365de9-d284-465e-a69f-20a06b192232-kube-api-access-9fpvm\") pod 
\"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.484650 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.484552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-root\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-sys\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89365de9-d284-465e-a69f-20a06b192232-node-exporter-textfile\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-metrics-client-ca\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-node-exporter-wtmp\") pod 
\"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:37.485603 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-node-exporter-accelerators-collector-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:29:37.485671 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls podName:89365de9-d284-465e-a69f-20a06b192232 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:37.985653117 +0000 UTC m=+190.793009708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls") pod "node-exporter-hzb5l" (UID: "89365de9-d284-465e-a69f-20a06b192232") : secret "node-exporter-tls" not found Apr 24 21:29:37.485706 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89365de9-d284-465e-a69f-20a06b192232-node-exporter-textfile\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.486336 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.485986 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-node-exporter-wtmp\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.491240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.486520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89365de9-d284-465e-a69f-20a06b192232-sys\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.491240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.486574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89365de9-d284-465e-a69f-20a06b192232-metrics-client-ca\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.493656 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:29:37.493632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.497135 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.497115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fpvm\" (UniqueName: \"kubernetes.io/projected/89365de9-d284-465e-a69f-20a06b192232-kube-api-access-9fpvm\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.988778 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.988737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:37.991390 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:37.991361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89365de9-d284-465e-a69f-20a06b192232-node-exporter-tls\") pod \"node-exporter-hzb5l\" (UID: \"89365de9-d284-465e-a69f-20a06b192232\") " pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:38.210378 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:38.210343 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hzb5l" Apr 24 21:29:38.218749 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:38.218719 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89365de9_d284_465e_a69f_20a06b192232.slice/crio-1e1f5a6db93d3bbf37a455932195aacb1a5c1f37948530ab3e44f5bc2df56cc5 WatchSource:0}: Error finding container 1e1f5a6db93d3bbf37a455932195aacb1a5c1f37948530ab3e44f5bc2df56cc5: Status 404 returned error can't find the container with id 1e1f5a6db93d3bbf37a455932195aacb1a5c1f37948530ab3e44f5bc2df56cc5 Apr 24 21:29:38.373588 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:38.373509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hzb5l" event={"ID":"89365de9-d284-465e-a69f-20a06b192232","Type":"ContainerStarted","Data":"1e1f5a6db93d3bbf37a455932195aacb1a5c1f37948530ab3e44f5bc2df56cc5"} Apr 24 21:29:39.377357 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:39.377318 2574 generic.go:358] "Generic (PLEG): container finished" podID="89365de9-d284-465e-a69f-20a06b192232" containerID="0ffc344896a5380c2700640e1ba72c9313425cd284faa67da6c8690092d468dc" exitCode=0 Apr 24 21:29:39.377854 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:39.377369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hzb5l" event={"ID":"89365de9-d284-465e-a69f-20a06b192232","Type":"ContainerDied","Data":"0ffc344896a5380c2700640e1ba72c9313425cd284faa67da6c8690092d468dc"} Apr 24 21:29:40.383100 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:40.383061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hzb5l" event={"ID":"89365de9-d284-465e-a69f-20a06b192232","Type":"ContainerStarted","Data":"05ccb139e335ee1235b17db11f46c2a043e657d4c67487d655af44c2a29cb1b5"} Apr 24 21:29:40.383100 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:40.383105 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hzb5l" event={"ID":"89365de9-d284-465e-a69f-20a06b192232","Type":"ContainerStarted","Data":"0877f4a07d2392c2ba5193944d5cba74e4125c6665e4049ba4de42a9c5c8c528"} Apr 24 21:29:40.426161 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:40.426107 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hzb5l" podStartSLOduration=2.764007484 podStartE2EDuration="3.426090261s" podCreationTimestamp="2026-04-24 21:29:37 +0000 UTC" firstStartedPulling="2026-04-24 21:29:38.220352443 +0000 UTC m=+191.027709026" lastFinishedPulling="2026-04-24 21:29:38.88243522 +0000 UTC m=+191.689791803" observedRunningTime="2026-04-24 21:29:40.424229327 +0000 UTC m=+193.231585932" watchObservedRunningTime="2026-04-24 21:29:40.426090261 +0000 UTC m=+193.233446866" Apr 24 21:29:42.157107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.157067 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8822h"] Apr 24 21:29:42.160464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.160443 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" Apr 24 21:29:42.169922 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.169898 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-vlhnb\"" Apr 24 21:29:42.170147 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.170133 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:29:42.180085 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.180061 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8822h"] Apr 24 21:29:42.326688 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.326654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3436e07c-cd57-497e-ab5a-9e3d447d77f8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8822h\" (UID: \"3436e07c-cd57-497e-ab5a-9e3d447d77f8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" Apr 24 21:29:42.427617 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.427534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3436e07c-cd57-497e-ab5a-9e3d447d77f8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8822h\" (UID: \"3436e07c-cd57-497e-ab5a-9e3d447d77f8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" Apr 24 21:29:42.429974 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.429942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3436e07c-cd57-497e-ab5a-9e3d447d77f8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8822h\" (UID: \"3436e07c-cd57-497e-ab5a-9e3d447d77f8\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" Apr 24 21:29:42.468865 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.468814 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" Apr 24 21:29:42.588025 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:42.587857 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8822h"] Apr 24 21:29:42.590495 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:29:42.590470 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3436e07c_cd57_497e_ab5a_9e3d447d77f8.slice/crio-80a9fa0ab193b2f2c22fd4d81c04a913431d2143c68e577b7333536a8220881d WatchSource:0}: Error finding container 80a9fa0ab193b2f2c22fd4d81c04a913431d2143c68e577b7333536a8220881d: Status 404 returned error can't find the container with id 80a9fa0ab193b2f2c22fd4d81c04a913431d2143c68e577b7333536a8220881d Apr 24 21:29:43.393058 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.393016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" event={"ID":"3436e07c-cd57-497e-ab5a-9e3d447d77f8","Type":"ContainerStarted","Data":"80a9fa0ab193b2f2c22fd4d81c04a913431d2143c68e577b7333536a8220881d"} Apr 24 21:29:43.621526 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.621488 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:29:43.625931 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.625905 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.628391 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.628331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:29:43.628391 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.628331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:29:43.628714 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.628698 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:29:43.629135 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629000 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:29:43.629135 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629013 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:29:43.629297 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629240 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:29:43.629566 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:29:43.629566 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:29:43.630258 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629769 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7me3v24kfnfqf\"" Apr 24 21:29:43.630258 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.629903 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:29:43.630258 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.630151 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:29:43.630460 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.630364 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:29:43.630652 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.630557 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4w8vk\"" Apr 24 21:29:43.632670 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.632335 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:29:43.632670 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.632506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:29:43.637775 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.637735 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:29:43.739732 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.739949 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739933 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.739986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:29:43.740030 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740285 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740589 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8cjz\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740589 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740589 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740589 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740453 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.740589 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.740477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.841849 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.841774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.841849 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.841855 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.841901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.841927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.841953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842033 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " 
pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8cjz\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842464 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.842915 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.842596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.843446 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.843043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.844229 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.844062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.845548 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.845518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.846167 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.846044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.846167 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.846079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.846167 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.846100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.846550 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.846523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.846781 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.846758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.847311 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.847280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.848191 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.848130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.848191 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.848137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.848669 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.848618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.848911 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.848723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.848911 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.848737 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.849550 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.849527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.849972 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.849940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.852421 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.852397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8cjz\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz\") pod \"prometheus-k8s-0\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:43.942743 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:43.942707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:44.094701 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.094620 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:44.396416 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.396325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" event={"ID":"3436e07c-cd57-497e-ab5a-9e3d447d77f8","Type":"ContainerStarted","Data":"bf0bb4a3bb10f71be5bd22e938332bed572c4927bed4d9a3841015b9c4151991"}
Apr 24 21:29:44.396863 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.396555 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h"
Apr 24 21:29:44.397478 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.397451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"cbaf0a0b5273c7f18c7e9556dd1b6933053874d6d04cd4b44ae4d59da737b16d"}
Apr 24 21:29:44.401553 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.401530 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h"
Apr 24 21:29:44.413411 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:44.413372 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8822h" podStartSLOduration=1.123918998 podStartE2EDuration="2.413361713s" podCreationTimestamp="2026-04-24 21:29:42 +0000 UTC" firstStartedPulling="2026-04-24 21:29:42.59245509 +0000 UTC m=+195.399811673" lastFinishedPulling="2026-04-24 21:29:43.881897792 +0000 UTC m=+196.689254388" observedRunningTime="2026-04-24 21:29:44.41264689 +0000 UTC m=+197.220003515" watchObservedRunningTime="2026-04-24 21:29:44.413361713 +0000 UTC m=+197.220718372"
Apr 24 21:29:45.401145 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:45.401111 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" exitCode=0
Apr 24 21:29:45.401529 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:45.401205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"}
Apr 24 21:29:46.852975 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:46.852913 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" podUID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" containerName="registry" containerID="cri-o://ec5bd826d57dd542705e506a460047fa99943482d7d1ba9bdc37982ef69b4728" gracePeriod=30
Apr 24 21:29:47.413363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:47.413310 2574 generic.go:358] "Generic (PLEG): container finished" podID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" containerID="ec5bd826d57dd542705e506a460047fa99943482d7d1ba9bdc37982ef69b4728" exitCode=0
Apr 24 21:29:47.413363 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:47.413365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" event={"ID":"61e99d32-d2fc-4fc0-bfbc-020aaca95305","Type":"ContainerDied","Data":"ec5bd826d57dd542705e506a460047fa99943482d7d1ba9bdc37982ef69b4728"}
Apr 24 21:29:48.021472 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.021450 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:48.184517 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184494 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184638 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184547 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184638 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184628 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184749 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184663 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94m6v\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184749 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184739 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184877 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184800 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.184877 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184868 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.185039 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.184896 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") pod \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\" (UID: \"61e99d32-d2fc-4fc0-bfbc-020aaca95305\") "
Apr 24 21:29:48.185197 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.185082 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:48.185294 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.185265 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-trusted-ca\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.185548 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.185515 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:48.188510 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.188465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:48.188614 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.188521 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v" (OuterVolumeSpecName: "kube-api-access-94m6v") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "kube-api-access-94m6v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:48.188614 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.188528 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:48.188972 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.188948 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:48.189050 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.188990 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:48.195406 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.195382 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "61e99d32-d2fc-4fc0-bfbc-020aaca95305" (UID: "61e99d32-d2fc-4fc0-bfbc-020aaca95305"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:48.286240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286147 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94m6v\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-kube-api-access-94m6v\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286221 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-certificates\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286240 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286239 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-installation-pull-secrets\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286513 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286261 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-bound-sa-token\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286513 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286277 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61e99d32-d2fc-4fc0-bfbc-020aaca95305-registry-tls\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286513 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286291 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61e99d32-d2fc-4fc0-bfbc-020aaca95305-image-registry-private-configuration\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.286513 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.286306 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61e99d32-d2fc-4fc0-bfbc-020aaca95305-ca-trust-extracted\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\""
Apr 24 21:29:48.419274 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.419235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"}
Apr 24 21:29:48.419274 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.419280 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"}
Apr 24 21:29:48.420360 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.420330 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l" event={"ID":"61e99d32-d2fc-4fc0-bfbc-020aaca95305","Type":"ContainerDied","Data":"232e554e7a592a72c00e42de066fe6658270ca2bb9d6623ba698a18447cb7c37"}
Apr 24 21:29:48.420511 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.420375 2574 scope.go:117] "RemoveContainer" containerID="ec5bd826d57dd542705e506a460047fa99943482d7d1ba9bdc37982ef69b4728"
Apr 24 21:29:48.420511 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.420373 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86f6c45576-mlt7l"
Apr 24 21:29:48.444560 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.444508 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"]
Apr 24 21:29:48.448848 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:48.448805 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86f6c45576-mlt7l"]
Apr 24 21:29:49.788299 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:49.788266 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" path="/var/lib/kubelet/pods/61e99d32-d2fc-4fc0-bfbc-020aaca95305/volumes"
Apr 24 21:29:50.431993 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:50.431956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"}
Apr 24 21:29:50.431993 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:50.431992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"}
Apr 24 21:29:50.431993 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:50.432001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"}
Apr 24 21:29:50.432241 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:50.432011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerStarted","Data":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"}
Apr 24 21:29:50.466170 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:50.466115 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.763615652 podStartE2EDuration="7.466101094s" podCreationTimestamp="2026-04-24 21:29:43 +0000 UTC" firstStartedPulling="2026-04-24 21:29:44.095169732 +0000 UTC m=+196.902526315" lastFinishedPulling="2026-04-24 21:29:49.797655172 +0000 UTC m=+202.605011757" observedRunningTime="2026-04-24 21:29:50.464262004 +0000 UTC m=+203.271618609" watchObservedRunningTime="2026-04-24 21:29:50.466101094 +0000 UTC m=+203.273457698"
Apr 24 21:29:53.943665 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:29:53.943634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:18.505445 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:18.505409 2574 generic.go:358] "Generic (PLEG): container finished" podID="2aa96248-3e79-4b6e-b5ab-600b84643235" containerID="52797d3eb35c1f025263b16cd6fc2007ca45886fbdffb061d693f744f8257e1d" exitCode=0
Apr 24 21:30:18.505445 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:18.505447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" event={"ID":"2aa96248-3e79-4b6e-b5ab-600b84643235","Type":"ContainerDied","Data":"52797d3eb35c1f025263b16cd6fc2007ca45886fbdffb061d693f744f8257e1d"}
Apr 24 21:30:18.505982 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:18.505716 2574 scope.go:117] "RemoveContainer" containerID="52797d3eb35c1f025263b16cd6fc2007ca45886fbdffb061d693f744f8257e1d"
Apr 24 21:30:19.509880 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:19.509846 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zg8z7" event={"ID":"2aa96248-3e79-4b6e-b5ab-600b84643235","Type":"ContainerStarted","Data":"6ffec403cd926ac742fb0377bca970aa696fe1577aca80d7022f0800eb7f5e81"}
Apr 24 21:30:20.269810 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:20.269778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-2rpxz_2eb9d760-626d-4d98-9d3e-3f022ca09d78/cluster-monitoring-operator/0.log"
Apr 24 21:30:21.268565 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:21.268538 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8822h_3436e07c-cd57-497e-ab5a-9e3d447d77f8/monitoring-plugin/0.log"
Apr 24 21:30:21.481749 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:21.481722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/init-textfile/0.log"
Apr 24 21:30:21.669964 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:21.669929 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/node-exporter/0.log"
Apr 24 21:30:21.873471 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:21.873443 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/kube-rbac-proxy/0.log"
Apr 24 21:30:23.526148 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:23.526112 2574 generic.go:358] "Generic (PLEG): container finished" podID="30ae49ff-70be-49a5-864a-ffbc96166c41" containerID="1969d222b321cf67114542f9c9cd48882fa5194ded4abebff425c2f5028757d7" exitCode=0
Apr 24 21:30:23.526595 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:23.526184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl" event={"ID":"30ae49ff-70be-49a5-864a-ffbc96166c41","Type":"ContainerDied","Data":"1969d222b321cf67114542f9c9cd48882fa5194ded4abebff425c2f5028757d7"}
Apr 24 21:30:23.526595 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:23.526520 2574 scope.go:117] "RemoveContainer" containerID="1969d222b321cf67114542f9c9cd48882fa5194ded4abebff425c2f5028757d7"
Apr 24 21:30:23.874955 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:23.874879 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/init-config-reloader/0.log"
Apr 24 21:30:24.081178 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.081149 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/prometheus/0.log"
Apr 24 21:30:24.274647 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.274616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/config-reloader/0.log"
Apr 24 21:30:24.474293 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.474268 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/thanos-sidecar/0.log"
Apr 24 21:30:24.530427 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.530353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nmnkl" event={"ID":"30ae49ff-70be-49a5-864a-ffbc96166c41","Type":"ContainerStarted","Data":"2309a4f4f314d87e17f786045783ddec62a9e62716738de69045b045a155829f"}
Apr 24 21:30:24.687742 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.687710 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/kube-rbac-proxy-web/0.log"
Apr 24 21:30:24.870865 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:24.870767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/kube-rbac-proxy/0.log"
Apr 24 21:30:25.075040 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:25.075014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d23c8fde-d4e1-4324-af2a-7ea9d32be994/kube-rbac-proxy-thanos/0.log"
Apr 24 21:30:25.270643 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:25.270616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmqcd_7ce31d54-5e72-4ae6-b9c8-32c890858e6d/prometheus-operator/0.log"
Apr 24 21:30:25.469472 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:25.469433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmqcd_7ce31d54-5e72-4ae6-b9c8-32c890858e6d/kube-rbac-proxy/0.log"
Apr 24 21:30:39.594973 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:39.594941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:30:39.597245 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:39.597224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ed80245-164d-4d1c-8ed3-05523db4cd57-metrics-certs\") pod \"network-metrics-daemon-9csmp\" (UID: \"8ed80245-164d-4d1c-8ed3-05523db4cd57\") " pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:30:39.689214 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:39.689176 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\""
Apr 24 21:30:39.697594 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:39.697566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9csmp"
Apr 24 21:30:39.819931 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:39.819903 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9csmp"]
Apr 24 21:30:39.822737 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:30:39.822707 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ed80245_164d_4d1c_8ed3_05523db4cd57.slice/crio-5bc84a67ad1f73641770c7b64df459dab201aea36d2c6bd4836d29de687a6a12 WatchSource:0}: Error finding container 5bc84a67ad1f73641770c7b64df459dab201aea36d2c6bd4836d29de687a6a12: Status 404 returned error can't find the container with id 5bc84a67ad1f73641770c7b64df459dab201aea36d2c6bd4836d29de687a6a12
Apr 24 21:30:40.575571 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:40.575485 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9csmp" event={"ID":"8ed80245-164d-4d1c-8ed3-05523db4cd57","Type":"ContainerStarted","Data":"5bc84a67ad1f73641770c7b64df459dab201aea36d2c6bd4836d29de687a6a12"}
Apr 24 21:30:41.584678 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:41.584643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9csmp" event={"ID":"8ed80245-164d-4d1c-8ed3-05523db4cd57","Type":"ContainerStarted","Data":"3961629959adb451a17d6d26c22bf2a8f8034c7937617823a2a3d48db61271cb"}
Apr 24 21:30:41.584678 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:41.584683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9csmp" event={"ID":"8ed80245-164d-4d1c-8ed3-05523db4cd57","Type":"ContainerStarted","Data":"d3aa402a79fd3ea2db6a6d02f662dbdce81b81d0f7b05a0a4597123535c3884a"}
Apr 24 21:30:41.603536 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:41.603492 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9csmp" podStartSLOduration=253.721363023 podStartE2EDuration="4m14.60347928s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:30:39.824547215 +0000 UTC m=+252.631903799" lastFinishedPulling="2026-04-24 21:30:40.706663473 +0000 UTC m=+253.514020056" observedRunningTime="2026-04-24 21:30:41.602163704 +0000 UTC m=+254.409520310" watchObservedRunningTime="2026-04-24 21:30:41.60347928 +0000 UTC m=+254.410835884"
Apr 24 21:30:43.943000 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:43.942966 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:43.958265 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:43.958241 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:44.608098 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:30:44.608072 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:02.098991 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.098948 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:31:02.099614 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099557 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy" containerID="cri-o://f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"
gracePeriod=600 Apr 24 21:31:02.099691 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099596 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" gracePeriod=600 Apr 24 21:31:02.099691 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099663 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-web" containerID="cri-o://4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" gracePeriod=600 Apr 24 21:31:02.099691 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099537 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="prometheus" containerID="cri-o://95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" gracePeriod=600 Apr 24 21:31:02.099847 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099596 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="config-reloader" containerID="cri-o://0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" gracePeriod=600 Apr 24 21:31:02.099847 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.099790 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="thanos-sidecar" containerID="cri-o://f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" gracePeriod=600 Apr 24 21:31:02.348142 ip-10-0-137-28 kubenswrapper[2574]: I0424 
21:31:02.348118 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.496734 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496694 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496753 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496786 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496851 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496881 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496915 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.496958 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496950 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497283 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.496979 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497381 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497353 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:02.497560 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497535 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:02.497642 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497588 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497700 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497640 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497700 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497674 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8cjz\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497803 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497702 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0\") pod 
\"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497803 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497737 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497803 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497780 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497811 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497859 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497892 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.497992 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.497922 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config\") pod \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\" (UID: \"d23c8fde-d4e1-4324-af2a-7ea9d32be994\") " Apr 24 21:31:02.498231 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.498209 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-metrics-client-ca\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.498293 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.498238 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.498581 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.498546 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:02.498668 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.498622 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:02.499723 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.499440 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:02.500075 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.499982 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:02.500075 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.500048 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out" (OuterVolumeSpecName: "config-out") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:02.500311 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.500280 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:02.500544 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.500518 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.500694 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.500661 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.500876 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.500847 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.501182 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.501159 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config" (OuterVolumeSpecName: "config") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.501678 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.501646 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz" (OuterVolumeSpecName: "kube-api-access-x8cjz") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "kube-api-access-x8cjz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:02.501813 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.501795 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.501948 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.501866 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.502516 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.502499 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.502807 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.502789 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.511068 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.511047 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config" (OuterVolumeSpecName: "web-config") pod "d23c8fde-d4e1-4324-af2a-7ea9d32be994" (UID: "d23c8fde-d4e1-4324-af2a-7ea9d32be994"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:02.598640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598602 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598634 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598640 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598646 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598656 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598665 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598674 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-kube-rbac-proxy\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598685 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-tls-assets\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598693 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598701 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-config-out\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598709 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-web-config\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598717 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-grpc-tls\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598725 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-db\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" 
Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598733 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8cjz\" (UniqueName: \"kubernetes.io/projected/d23c8fde-d4e1-4324-af2a-7ea9d32be994-kube-api-access-x8cjz\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598742 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598752 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d23c8fde-d4e1-4324-af2a-7ea9d32be994-secret-metrics-client-certs\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.598887 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.598761 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23c8fde-d4e1-4324-af2a-7ea9d32be994-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:31:02.646125 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646093 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" exitCode=0 Apr 24 21:31:02.646125 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646118 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" exitCode=0 Apr 24 21:31:02.646125 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646124 2574 generic.go:358] "Generic (PLEG): 
container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" exitCode=0 Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646136 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" exitCode=0 Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646144 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" exitCode=0 Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646149 2574 generic.go:358] "Generic (PLEG): container finished" podID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" exitCode=0 Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646192 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646302 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d23c8fde-d4e1-4324-af2a-7ea9d32be994","Type":"ContainerDied","Data":"cbaf0a0b5273c7f18c7e9556dd1b6933053874d6d04cd4b44ae4d59da737b16d"} Apr 24 21:31:02.646359 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.646353 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.653434 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.653418 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.660426 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.660407 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.666860 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.666839 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.671438 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.671417 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:02.673633 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.673619 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.679201 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.679150 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:02.684327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.684308 2574 scope.go:117] "RemoveContainer" 
containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.691418 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.691396 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.697603 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.697586 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.697870 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.697819 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.697951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.697883 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.697951 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.697929 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.698205 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.698188 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.698261 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698211 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.698261 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698226 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.698431 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.698414 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.698492 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698440 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container 
\"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.698492 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698461 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.698686 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.698665 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.698723 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698692 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.698723 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698706 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.698932 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.698914 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 
0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.698998 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698937 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.698998 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.698951 2574 scope.go:117] "RemoveContainer" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.699153 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.699137 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.699201 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699156 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 
95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.699201 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699168 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.699365 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:02.699351 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.699412 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699367 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.699412 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699377 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.699572 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699553 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container 
\"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.699616 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699573 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.699780 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699763 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.699869 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.699783 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.700039 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700022 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.700107 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700041 2574 scope.go:117] "RemoveContainer" 
containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.700245 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700227 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.700288 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700255 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.700451 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700434 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.700515 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700452 2574 scope.go:117] "RemoveContainer" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.700656 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700640 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status 
\"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.700700 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700657 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.700864 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700845 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.700926 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.700864 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.701104 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701085 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.701150 ip-10-0-137-28 
kubenswrapper[2574]: I0424 21:31:02.701104 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.701298 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701282 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.701334 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701298 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.701486 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701472 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.701525 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701488 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.701689 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701667 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.701689 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701687 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.701903 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701888 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.701903 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.701903 2574 scope.go:117] "RemoveContainer" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.704110 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.702237 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 
95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.704110 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.702261 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.704400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.704343 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.704400 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.704370 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.705077 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705050 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.705174 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705085 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.705402 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705379 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.705496 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705404 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.705667 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705640 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.705734 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705668 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.705947 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705929 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container 
\"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.706037 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.705949 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.706075 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706060 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:02.706213 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706188 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.706257 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706214 2574 scope.go:117] "RemoveContainer" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.706383 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706352 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="config-reloader" Apr 24 21:31:02.706383 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706369 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="config-reloader" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706398 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" containerName="registry" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706407 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" containerName="registry" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706418 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-web" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706427 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-web" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706428 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706443 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706465 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706474 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy" Apr 24 
21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706483 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="init-config-reloader" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706492 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="init-config-reloader" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706500 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="prometheus" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706508 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="prometheus" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706516 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="thanos-sidecar" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706524 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="thanos-sidecar" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706533 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:02.706537 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706543 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706620 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" 
containerName="config-reloader" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706632 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706630 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706644 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706651 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706656 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="61e99d32-d2fc-4fc0-bfbc-020aaca95305" containerName="registry" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706665 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="thanos-sidecar" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706675 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="prometheus" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: 
I0424 21:31:02.706683 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" containerName="kube-rbac-proxy-web" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706906 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.706926 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707171 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.707295 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707190 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.707883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707400 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.707883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707417 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.707883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707679 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.707883 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707696 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.708084 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707968 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 
0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.708084 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.707992 2574 scope.go:117] "RemoveContainer" containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.708234 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708213 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.708234 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708232 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.708437 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708412 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.708477 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708441 2574 scope.go:117] "RemoveContainer" containerID="d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0" Apr 24 21:31:02.708659 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708626 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0"} err="failed to get container status \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": rpc error: code = NotFound desc = could not find container \"d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0\": container with ID starting with d044468612a9dbd54f4ca1440822f363f8db16f82da3ab8004506bb80e5bdcc0 not found: ID does not exist" Apr 24 21:31:02.708659 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708653 2574 scope.go:117] "RemoveContainer" containerID="f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b" Apr 24 21:31:02.708933 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708909 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b"} err="failed to get container status \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": rpc error: code = NotFound desc = could not find container \"f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b\": container with ID starting with f60bcf88eca46efbe121d6f89be5b90b3d642e00e715fb9e514de026a6f39e5b not found: ID does not exist" Apr 24 21:31:02.708933 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.708932 2574 scope.go:117] "RemoveContainer" containerID="4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962" Apr 24 21:31:02.709184 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709158 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962"} err="failed to get container status \"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": rpc error: code = NotFound desc = could not find container 
\"4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962\": container with ID starting with 4af71cad4643a86a6f558a70a111e496d0d70a61218f756301e2e3308b87f962 not found: ID does not exist" Apr 24 21:31:02.709272 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709187 2574 scope.go:117] "RemoveContainer" containerID="f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699" Apr 24 21:31:02.709466 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709447 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699"} err="failed to get container status \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": rpc error: code = NotFound desc = could not find container \"f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699\": container with ID starting with f03291481556ac7dee3f5ec2bc7de6ecee9127fe23ab42447c4cb59c2ca09699 not found: ID does not exist" Apr 24 21:31:02.709524 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709467 2574 scope.go:117] "RemoveContainer" containerID="0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5" Apr 24 21:31:02.709722 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709701 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5"} err="failed to get container status \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": rpc error: code = NotFound desc = could not find container \"0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5\": container with ID starting with 0b175ee88e0178e92daea382666a450ec7e66f61e787aff1910504f89ae4f9b5 not found: ID does not exist" Apr 24 21:31:02.709777 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709723 2574 scope.go:117] "RemoveContainer" 
containerID="95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5" Apr 24 21:31:02.709986 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709964 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5"} err="failed to get container status \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": rpc error: code = NotFound desc = could not find container \"95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5\": container with ID starting with 95b511460b8a60a10190ff480c2ac331c51a4470e8818b1abc5c8d5c362176f5 not found: ID does not exist" Apr 24 21:31:02.710045 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.709987 2574 scope.go:117] "RemoveContainer" containerID="6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5" Apr 24 21:31:02.710190 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.710170 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5"} err="failed to get container status \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": rpc error: code = NotFound desc = could not find container \"6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5\": container with ID starting with 6d0bbb0c3f098b2f03921ecf2658309ddd9fc7e33c3528ebc59da3e05a9d93b5 not found: ID does not exist" Apr 24 21:31:02.712670 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.712654 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.715596 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.715576 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:31:02.715709 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.715580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:31:02.715709 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.715631 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:31:02.716034 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.715998 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:31:02.716034 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716010 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:31:02.716189 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716015 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:31:02.716189 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716110 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:31:02.716189 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716129 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:31:02.716339 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716320 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:31:02.716431 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716392 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7me3v24kfnfqf\"" Apr 24 21:31:02.716494 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716473 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:31:02.716653 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716637 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4w8vk\"" Apr 24 21:31:02.716737 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.716641 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:31:02.720459 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.720439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:31:02.725788 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.725768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:31:02.726427 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.726409 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:02.900821 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.900789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-web-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
24 21:31:02.900996 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.900856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.900996 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.900943 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.900996 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.900982 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901158 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901158 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901158 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901158 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 
21:31:02.901234 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config-out\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901327 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5dx\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-kube-api-access-xt5dx\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:02.901502 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:02.901435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.001783 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.001783 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config-out\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5dx\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-kube-api-access-xt5dx\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.001975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002059 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-web-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002165 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.002407 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.003207 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.003207 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.003207 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.002980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.003391 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.003254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.003638 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.003613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.004357 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.004330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.005053 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.005029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.005053 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.005043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.005197 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.005053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.006357 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.006332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.006444 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.006373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.006752 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.006718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-config-out\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.006880 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.006784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.006880 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.006796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.007472 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.007449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.007736 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.007717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.007771 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.007748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-web-config\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.008478 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.008461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.018685 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.018661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xt5dx\" (UniqueName: \"kubernetes.io/projected/5543b5e0-dff7-4ddc-9c06-a4d3f79d6427-kube-api-access-xt5dx\") pod \"prometheus-k8s-0\" (UID: \"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.023527 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.023507 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.155320 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.155289 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:03.155883 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:31:03.155853 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5543b5e0_dff7_4ddc_9c06_a4d3f79d6427.slice/crio-799141bf11781871c4d51a4a4cb45b8271c392c37a1a00f13d0d63e72c2247b9 WatchSource:0}: Error finding container 799141bf11781871c4d51a4a4cb45b8271c392c37a1a00f13d0d63e72c2247b9: Status 404 returned error can't find the container with id 799141bf11781871c4d51a4a4cb45b8271c392c37a1a00f13d0d63e72c2247b9 Apr 24 21:31:03.651288 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.651255 2574 generic.go:358] "Generic (PLEG): container finished" podID="5543b5e0-dff7-4ddc-9c06-a4d3f79d6427" containerID="251c31bb178b881fad554cc7dff878e6d5d8fb5b19fcaf2cb9aab802d6dadc6c" exitCode=0 Apr 24 21:31:03.651448 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.651327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerDied","Data":"251c31bb178b881fad554cc7dff878e6d5d8fb5b19fcaf2cb9aab802d6dadc6c"} Apr 24 21:31:03.651448 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.651353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"799141bf11781871c4d51a4a4cb45b8271c392c37a1a00f13d0d63e72c2247b9"} Apr 24 21:31:03.788153 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:03.788118 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23c8fde-d4e1-4324-af2a-7ea9d32be994" path="/var/lib/kubelet/pods/d23c8fde-d4e1-4324-af2a-7ea9d32be994/volumes" Apr 24 21:31:04.657411 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"342277f653fd4c12ba56fc86033a98c0a898dfe6ee6220eb23195ebe5785ea80"} Apr 24 21:31:04.657411 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"0f57598bfdfa6a6434bc5a0fa5ae3e282fb8a0f9bded30698afac7fc12182c88"} Apr 24 21:31:04.657411 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"9b0768efc799182cfe76ffd6679409e6e4fa494efceb60c6afc71b99c6ab368b"} Apr 24 21:31:04.657868 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"ffa7c2657d3c17c1dce92c75152983aa9a8bf1dc1f0b73f69af3b9fbb5d349ec"} Apr 24 21:31:04.657868 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"32e73393c690b92695b2e409b58e1c70eb72805588ac47e5f47e0b7122203da3"} Apr 24 21:31:04.657868 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.657446 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5543b5e0-dff7-4ddc-9c06-a4d3f79d6427","Type":"ContainerStarted","Data":"924eafe9329e3dad694040dcfb283c185ad35c1d4dbcc53541b820917a32c8a9"} Apr 24 21:31:04.691643 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:04.691585 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.691566544 podStartE2EDuration="2.691566544s" podCreationTimestamp="2026-04-24 21:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:04.685607906 +0000 UTC m=+277.492964511" watchObservedRunningTime="2026-04-24 21:31:04.691566544 +0000 UTC m=+277.498923148" Apr 24 21:31:06.258387 ip-10-0-137-28 kubenswrapper[2574]: E0424 21:31:06.258333 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p9djn" podUID="07d12594-0cd4-4f7e-8c3a-c529a1051347" Apr 24 21:31:06.664511 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:06.664479 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:31:08.023966 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:08.023924 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:10.060002 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.059964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:31:10.062364 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.062342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07d12594-0cd4-4f7e-8c3a-c529a1051347-cert\") pod \"ingress-canary-p9djn\" (UID: \"07d12594-0cd4-4f7e-8c3a-c529a1051347\") " pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:31:10.267613 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.267580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:31:10.275666 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.275641 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9djn" Apr 24 21:31:10.397274 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.397246 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p9djn"] Apr 24 21:31:10.403128 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:31:10.401580 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d12594_0cd4_4f7e_8c3a_c529a1051347.slice/crio-6970266c76805fe0cfaea0fcef3e65a6f99f293ba2a60dca51e77ec9dcc37adf WatchSource:0}: Error finding container 6970266c76805fe0cfaea0fcef3e65a6f99f293ba2a60dca51e77ec9dcc37adf: Status 404 returned error can't find the container with id 6970266c76805fe0cfaea0fcef3e65a6f99f293ba2a60dca51e77ec9dcc37adf Apr 24 21:31:10.675209 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:10.675175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p9djn" event={"ID":"07d12594-0cd4-4f7e-8c3a-c529a1051347","Type":"ContainerStarted","Data":"6970266c76805fe0cfaea0fcef3e65a6f99f293ba2a60dca51e77ec9dcc37adf"} Apr 24 21:31:12.681405 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:12.681369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p9djn" event={"ID":"07d12594-0cd4-4f7e-8c3a-c529a1051347","Type":"ContainerStarted","Data":"3ba5ee1d0f4a342ca7e26350f1873db2ab88222e859a3bcfb713e7f51970bb0c"} Apr 24 21:31:27.695438 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:27.695409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:31:27.696037 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:27.696012 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:31:27.698298 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:31:27.698281 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:32:03.024092 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:32:03.024040 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:03.039347 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:32:03.039324 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:03.075452 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:32:03.075385 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p9djn" podStartSLOduration=301.600891424 podStartE2EDuration="5m3.075367281s" podCreationTimestamp="2026-04-24 21:27:00 +0000 UTC" firstStartedPulling="2026-04-24 21:31:10.404613029 +0000 UTC m=+283.211969617" lastFinishedPulling="2026-04-24 21:31:11.879088887 +0000 UTC m=+284.686445474" observedRunningTime="2026-04-24 21:31:12.712116718 +0000 UTC m=+285.519473333" watchObservedRunningTime="2026-04-24 21:32:03.075367281 +0000 UTC m=+335.882723887" Apr 24 21:32:03.837231 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:32:03.837202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:34:58.795266 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.795229 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-kvldg"] Apr 24 21:34:58.798493 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.798475 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kvldg" Apr 24 21:34:58.801026 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.801000 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:34:58.801136 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.801099 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:34:58.801635 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.801621 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4bgsb\"" Apr 24 21:34:58.801674 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.801635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:34:58.806772 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.806750 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kvldg"] Apr 24 21:34:58.913214 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:58.913171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgtb2\" (UniqueName: \"kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2\") pod \"s3-init-kvldg\" (UID: \"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd\") " pod="kserve/s3-init-kvldg" Apr 24 21:34:59.014356 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.014309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgtb2\" (UniqueName: \"kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2\") pod \"s3-init-kvldg\" (UID: \"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd\") " pod="kserve/s3-init-kvldg" Apr 24 21:34:59.022037 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.022015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgtb2\" (UniqueName: 
\"kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2\") pod \"s3-init-kvldg\" (UID: \"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd\") " pod="kserve/s3-init-kvldg" Apr 24 21:34:59.108299 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.108203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kvldg" Apr 24 21:34:59.226747 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.226712 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kvldg"] Apr 24 21:34:59.229722 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:34:59.229699 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf952ab_8b2a_40b0_9931_11e5e87ee6bd.slice/crio-21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8 WatchSource:0}: Error finding container 21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8: Status 404 returned error can't find the container with id 21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8 Apr 24 21:34:59.231447 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.231430 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:59.310657 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:34:59.310622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvldg" event={"ID":"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd","Type":"ContainerStarted","Data":"21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8"} Apr 24 21:35:04.327113 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:04.327074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvldg" event={"ID":"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd","Type":"ContainerStarted","Data":"d0cd5dc33f29505c2d3fb26cf972b10d7f8b2863125338938529e2592e066e7b"} Apr 24 21:35:04.343267 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:04.343223 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-kvldg" podStartSLOduration=1.920267145 podStartE2EDuration="6.343208132s" podCreationTimestamp="2026-04-24 21:34:58 +0000 UTC" firstStartedPulling="2026-04-24 21:34:59.231581296 +0000 UTC m=+512.038937878" lastFinishedPulling="2026-04-24 21:35:03.654522283 +0000 UTC m=+516.461878865" observedRunningTime="2026-04-24 21:35:04.342232597 +0000 UTC m=+517.149589203" watchObservedRunningTime="2026-04-24 21:35:04.343208132 +0000 UTC m=+517.150564736" Apr 24 21:35:07.336448 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:07.336414 2574 generic.go:358] "Generic (PLEG): container finished" podID="fdf952ab-8b2a-40b0-9931-11e5e87ee6bd" containerID="d0cd5dc33f29505c2d3fb26cf972b10d7f8b2863125338938529e2592e066e7b" exitCode=0 Apr 24 21:35:07.336857 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:07.336487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvldg" event={"ID":"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd","Type":"ContainerDied","Data":"d0cd5dc33f29505c2d3fb26cf972b10d7f8b2863125338938529e2592e066e7b"} Apr 24 21:35:08.457993 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:08.457971 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kvldg" Apr 24 21:35:08.604014 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:08.603899 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgtb2\" (UniqueName: \"kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2\") pod \"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd\" (UID: \"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd\") " Apr 24 21:35:08.606089 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:08.606062 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2" (OuterVolumeSpecName: "kube-api-access-qgtb2") pod "fdf952ab-8b2a-40b0-9931-11e5e87ee6bd" (UID: "fdf952ab-8b2a-40b0-9931-11e5e87ee6bd"). InnerVolumeSpecName "kube-api-access-qgtb2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:08.704813 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:08.704778 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgtb2\" (UniqueName: \"kubernetes.io/projected/fdf952ab-8b2a-40b0-9931-11e5e87ee6bd-kube-api-access-qgtb2\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:35:09.343152 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:09.343119 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kvldg" Apr 24 21:35:09.343334 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:09.343120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvldg" event={"ID":"fdf952ab-8b2a-40b0-9931-11e5e87ee6bd","Type":"ContainerDied","Data":"21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8"} Apr 24 21:35:09.343334 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:09.343234 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e939f08fa9adb4bdf2b19e5e2f4d3026c357b9cf3ca0e2f23326cb41aa41f8" Apr 24 21:35:16.369851 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.369757 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-pcwzc"] Apr 24 21:35:16.370202 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.370062 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdf952ab-8b2a-40b0-9931-11e5e87ee6bd" containerName="s3-init" Apr 24 21:35:16.370202 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.370073 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf952ab-8b2a-40b0-9931-11e5e87ee6bd" containerName="s3-init" Apr 24 21:35:16.370202 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.370129 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdf952ab-8b2a-40b0-9931-11e5e87ee6bd" containerName="s3-init" Apr 24 21:35:16.372868 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.372853 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:16.375206 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.375181 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:35:16.375344 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.375244 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:35:16.375344 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.375282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:35:16.375458 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.375390 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4bgsb\"" Apr 24 21:35:16.379773 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.379747 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-pcwzc"] Apr 24 21:35:16.476627 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.476582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw6z4\" (UniqueName: \"kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4\") pod \"s3-tls-init-custom-pcwzc\" (UID: \"3dd70a66-aa34-4642-a4fe-10785d0b74e4\") " pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:16.577930 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.577894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw6z4\" (UniqueName: \"kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4\") pod \"s3-tls-init-custom-pcwzc\" (UID: \"3dd70a66-aa34-4642-a4fe-10785d0b74e4\") " pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:16.586499 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.586468 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kw6z4\" (UniqueName: \"kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4\") pod \"s3-tls-init-custom-pcwzc\" (UID: \"3dd70a66-aa34-4642-a4fe-10785d0b74e4\") " pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:16.682353 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.682323 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:16.799797 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:16.799754 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-pcwzc"] Apr 24 21:35:16.802395 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:35:16.802362 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd70a66_aa34_4642_a4fe_10785d0b74e4.slice/crio-fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f WatchSource:0}: Error finding container fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f: Status 404 returned error can't find the container with id fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f Apr 24 21:35:17.367372 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:17.367333 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pcwzc" event={"ID":"3dd70a66-aa34-4642-a4fe-10785d0b74e4","Type":"ContainerStarted","Data":"0e1f6fb077e210ec5e81012b8e27b416b825f51257aec51cf1eb0e59fdb06994"} Apr 24 21:35:17.367372 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:17.367368 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pcwzc" event={"ID":"3dd70a66-aa34-4642-a4fe-10785d0b74e4","Type":"ContainerStarted","Data":"fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f"} Apr 24 21:35:17.385814 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:17.385770 2574 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-custom-pcwzc" podStartSLOduration=1.385756669 podStartE2EDuration="1.385756669s" podCreationTimestamp="2026-04-24 21:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:17.383472275 +0000 UTC m=+530.190828880" watchObservedRunningTime="2026-04-24 21:35:17.385756669 +0000 UTC m=+530.193113273" Apr 24 21:35:21.379185 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:21.379101 2574 generic.go:358] "Generic (PLEG): container finished" podID="3dd70a66-aa34-4642-a4fe-10785d0b74e4" containerID="0e1f6fb077e210ec5e81012b8e27b416b825f51257aec51cf1eb0e59fdb06994" exitCode=0 Apr 24 21:35:21.379627 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:21.379182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pcwzc" event={"ID":"3dd70a66-aa34-4642-a4fe-10785d0b74e4","Type":"ContainerDied","Data":"0e1f6fb077e210ec5e81012b8e27b416b825f51257aec51cf1eb0e59fdb06994"} Apr 24 21:35:22.504790 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:22.504761 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:22.632758 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:22.632654 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw6z4\" (UniqueName: \"kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4\") pod \"3dd70a66-aa34-4642-a4fe-10785d0b74e4\" (UID: \"3dd70a66-aa34-4642-a4fe-10785d0b74e4\") " Apr 24 21:35:22.634768 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:22.634741 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4" (OuterVolumeSpecName: "kube-api-access-kw6z4") pod "3dd70a66-aa34-4642-a4fe-10785d0b74e4" (UID: "3dd70a66-aa34-4642-a4fe-10785d0b74e4"). InnerVolumeSpecName "kube-api-access-kw6z4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:22.733963 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:22.733912 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kw6z4\" (UniqueName: \"kubernetes.io/projected/3dd70a66-aa34-4642-a4fe-10785d0b74e4-kube-api-access-kw6z4\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:35:23.385441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:23.385408 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-pcwzc" Apr 24 21:35:23.385441 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:23.385433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-pcwzc" event={"ID":"3dd70a66-aa34-4642-a4fe-10785d0b74e4","Type":"ContainerDied","Data":"fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f"} Apr 24 21:35:23.385646 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:23.385465 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc16e4d6beabd3f7b4f24a46add15dd5443eb235a10dbe145d1f422be32e2f6f" Apr 24 21:35:25.635287 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.635250 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-jrh6n"] Apr 24 21:35:25.635644 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.635533 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd70a66-aa34-4642-a4fe-10785d0b74e4" containerName="s3-tls-init-custom" Apr 24 21:35:25.635644 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.635544 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd70a66-aa34-4642-a4fe-10785d0b74e4" containerName="s3-tls-init-custom" Apr 24 21:35:25.635644 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.635592 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd70a66-aa34-4642-a4fe-10785d0b74e4" containerName="s3-tls-init-custom" Apr 24 21:35:25.638679 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.638661 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:25.640941 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.640915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:35:25.641047 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.640945 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 21:35:25.641912 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.641894 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:35:25.642013 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.641900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4bgsb\"" Apr 24 21:35:25.646542 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.646521 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-jrh6n"] Apr 24 21:35:25.657482 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.657457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn442\" (UniqueName: \"kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442\") pod \"s3-tls-init-serving-jrh6n\" (UID: \"06f7f7d1-6514-4447-8836-6c9f124fcb75\") " pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:25.758200 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.758171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn442\" (UniqueName: \"kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442\") pod \"s3-tls-init-serving-jrh6n\" (UID: \"06f7f7d1-6514-4447-8836-6c9f124fcb75\") " pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:25.767543 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.767513 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hn442\" (UniqueName: \"kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442\") pod \"s3-tls-init-serving-jrh6n\" (UID: \"06f7f7d1-6514-4447-8836-6c9f124fcb75\") " pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:25.947734 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:25.947705 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:26.067624 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:26.067594 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-jrh6n"] Apr 24 21:35:26.070674 ip-10-0-137-28 kubenswrapper[2574]: W0424 21:35:26.070646 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06f7f7d1_6514_4447_8836_6c9f124fcb75.slice/crio-419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a WatchSource:0}: Error finding container 419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a: Status 404 returned error can't find the container with id 419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a Apr 24 21:35:26.395663 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:26.395573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jrh6n" event={"ID":"06f7f7d1-6514-4447-8836-6c9f124fcb75","Type":"ContainerStarted","Data":"b123dc757bcc2d38a4256c6ca5c6e68eb9f237fbe752736dcd682ddac67f2782"} Apr 24 21:35:26.395663 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:26.395614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jrh6n" event={"ID":"06f7f7d1-6514-4447-8836-6c9f124fcb75","Type":"ContainerStarted","Data":"419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a"} Apr 24 21:35:26.414014 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:26.413857 2574 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-serving-jrh6n" podStartSLOduration=1.413804324 podStartE2EDuration="1.413804324s" podCreationTimestamp="2026-04-24 21:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:26.413666412 +0000 UTC m=+539.221023020" watchObservedRunningTime="2026-04-24 21:35:26.413804324 +0000 UTC m=+539.221160927" Apr 24 21:35:31.411113 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:31.411076 2574 generic.go:358] "Generic (PLEG): container finished" podID="06f7f7d1-6514-4447-8836-6c9f124fcb75" containerID="b123dc757bcc2d38a4256c6ca5c6e68eb9f237fbe752736dcd682ddac67f2782" exitCode=0 Apr 24 21:35:31.411564 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:31.411152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jrh6n" event={"ID":"06f7f7d1-6514-4447-8836-6c9f124fcb75","Type":"ContainerDied","Data":"b123dc757bcc2d38a4256c6ca5c6e68eb9f237fbe752736dcd682ddac67f2782"} Apr 24 21:35:32.538480 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:32.538459 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:35:32.608475 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:32.608432 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn442\" (UniqueName: \"kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442\") pod \"06f7f7d1-6514-4447-8836-6c9f124fcb75\" (UID: \"06f7f7d1-6514-4447-8836-6c9f124fcb75\") " Apr 24 21:35:32.610388 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:32.610357 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442" (OuterVolumeSpecName: "kube-api-access-hn442") pod "06f7f7d1-6514-4447-8836-6c9f124fcb75" (UID: "06f7f7d1-6514-4447-8836-6c9f124fcb75"). InnerVolumeSpecName "kube-api-access-hn442". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:32.709353 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:32.709267 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hn442\" (UniqueName: \"kubernetes.io/projected/06f7f7d1-6514-4447-8836-6c9f124fcb75-kube-api-access-hn442\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 21:35:33.418212 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:33.418175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jrh6n" event={"ID":"06f7f7d1-6514-4447-8836-6c9f124fcb75","Type":"ContainerDied","Data":"419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a"} Apr 24 21:35:33.418212 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:33.418208 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419f549d258281218d056ecbdd5397f9c6ad8b478f38da387ba259b3f0cc7e6a" Apr 24 21:35:33.418212 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:35:33.418207 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jrh6n" Apr 24 21:36:27.716867 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:36:27.716822 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:36:27.717358 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:36:27.716902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:41:27.737644 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:41:27.737563 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:41:27.738249 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:41:27.737769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:46:27.757920 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:46:27.757884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:46:27.760198 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:46:27.760178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:51:27.777223 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:51:27.777194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:51:27.781050 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:51:27.781028 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:56:27.801303 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:56:27.801275 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 21:56:27.805376 ip-10-0-137-28 kubenswrapper[2574]: I0424 21:56:27.805356 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:01:27.824193 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:01:27.824168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:01:27.829624 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:01:27.829604 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:06:27.844127 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:06:27.844093 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:06:27.850452 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:06:27.850433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:11:27.863809 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:11:27.863732 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:11:27.870217 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:11:27.870197 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:16:27.885576 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:16:27.885545 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:16:27.895424 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:16:27.895402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:21:27.910433 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:21:27.910395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:21:27.919252 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:21:27.919232 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:26:27.929670 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:26:27.929541 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:26:27.939008 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:26:27.938976 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:31:08.602970 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.602931 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q9brd/must-gather-c86kz"] Apr 24 22:31:08.603572 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.603551 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="06f7f7d1-6514-4447-8836-6c9f124fcb75" containerName="s3-tls-init-serving" Apr 24 22:31:08.603641 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.603576 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f7f7d1-6514-4447-8836-6c9f124fcb75" containerName="s3-tls-init-serving" Apr 24 22:31:08.603739 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.603727 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="06f7f7d1-6514-4447-8836-6c9f124fcb75" containerName="s3-tls-init-serving" Apr 24 22:31:08.607451 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.607423 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.611485 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.611463 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-q9brd\"/\"kube-root-ca.crt\"" Apr 24 22:31:08.611779 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.611763 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-q9brd\"/\"openshift-service-ca.crt\"" Apr 24 22:31:08.618863 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.618843 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q9brd/must-gather-c86kz"] Apr 24 22:31:08.736350 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.736318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qpw\" (UniqueName: \"kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.736503 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.736376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.837163 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.837135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qpw\" (UniqueName: \"kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.837304 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.837186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.837452 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.837435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.845647 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.845621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qpw\" (UniqueName: \"kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw\") pod \"must-gather-c86kz\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:08.916196 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:08.916178 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:09.026254 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:09.026216 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q9brd/must-gather-c86kz"] Apr 24 22:31:09.030479 ip-10-0-137-28 kubenswrapper[2574]: W0424 22:31:09.030445 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61eea04_20da_4698_b725_a3b48184013f.slice/crio-324db11fa978cdc7755f34032ccdae110e3b8d40a805f34ee022805a2f80c31e WatchSource:0}: Error finding container 324db11fa978cdc7755f34032ccdae110e3b8d40a805f34ee022805a2f80c31e: Status 404 returned error can't find the container with id 324db11fa978cdc7755f34032ccdae110e3b8d40a805f34ee022805a2f80c31e Apr 24 22:31:09.032136 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:09.032117 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:31:09.775473 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:09.775431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q9brd/must-gather-c86kz" event={"ID":"b61eea04-20da-4698-b725-a3b48184013f","Type":"ContainerStarted","Data":"324db11fa978cdc7755f34032ccdae110e3b8d40a805f34ee022805a2f80c31e"} Apr 24 22:31:13.789315 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:13.789275 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q9brd/must-gather-c86kz" event={"ID":"b61eea04-20da-4698-b725-a3b48184013f","Type":"ContainerStarted","Data":"3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8"} Apr 24 22:31:13.789315 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:13.789308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q9brd/must-gather-c86kz" 
event={"ID":"b61eea04-20da-4698-b725-a3b48184013f","Type":"ContainerStarted","Data":"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da"} Apr 24 22:31:13.806440 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:13.806385 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q9brd/must-gather-c86kz" podStartSLOduration=2.112499808 podStartE2EDuration="5.806372441s" podCreationTimestamp="2026-04-24 22:31:08 +0000 UTC" firstStartedPulling="2026-04-24 22:31:09.032284501 +0000 UTC m=+3881.839641084" lastFinishedPulling="2026-04-24 22:31:12.726157117 +0000 UTC m=+3885.533513717" observedRunningTime="2026-04-24 22:31:13.805404255 +0000 UTC m=+3886.612760862" watchObservedRunningTime="2026-04-24 22:31:13.806372441 +0000 UTC m=+3886.613729044" Apr 24 22:31:27.955689 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:27.955575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:31:27.968206 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:27.965460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log" Apr 24 22:31:33.852056 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:33.852025 2574 generic.go:358] "Generic (PLEG): container finished" podID="b61eea04-20da-4698-b725-a3b48184013f" containerID="225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da" exitCode=0 Apr 24 22:31:33.852525 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:33.852087 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q9brd/must-gather-c86kz" event={"ID":"b61eea04-20da-4698-b725-a3b48184013f","Type":"ContainerDied","Data":"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da"} Apr 24 22:31:33.852525 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:33.852472 2574 
scope.go:117] "RemoveContainer" containerID="225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da" Apr 24 22:31:34.835677 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:34.835650 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q9brd_must-gather-c86kz_b61eea04-20da-4698-b725-a3b48184013f/gather/0.log" Apr 24 22:31:38.276774 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:38.276744 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9fjb6_2b28d774-e7a8-450d-9ac2-f68dc752098e/global-pull-secret-syncer/0.log" Apr 24 22:31:38.407874 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:38.407842 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4lns9_bae04246-77d0-46d3-9aa4-2a74e4817f4f/konnectivity-agent/0.log" Apr 24 22:31:38.613088 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:38.613016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-28.ec2.internal_562a113f040bbc989a373b07efb12bcb/haproxy/0.log" Apr 24 22:31:40.335554 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.335517 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q9brd/must-gather-c86kz"] Apr 24 22:31:40.335991 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.335738 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-q9brd/must-gather-c86kz" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="copy" containerID="cri-o://3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8" gracePeriod=2 Apr 24 22:31:40.337962 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.337927 2574 status_manager.go:895] "Failed to get status for pod" podUID="b61eea04-20da-4698-b725-a3b48184013f" pod="openshift-must-gather-q9brd/must-gather-c86kz" err="pods \"must-gather-c86kz\" is forbidden: User 
\"system:node:ip-10-0-137-28.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-q9brd\": no relationship found between node 'ip-10-0-137-28.ec2.internal' and this object" Apr 24 22:31:40.338230 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.338212 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q9brd/must-gather-c86kz"] Apr 24 22:31:40.564474 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.564446 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q9brd_must-gather-c86kz_b61eea04-20da-4698-b725-a3b48184013f/copy/0.log" Apr 24 22:31:40.564785 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.564770 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:40.566924 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.566897 2574 status_manager.go:895] "Failed to get status for pod" podUID="b61eea04-20da-4698-b725-a3b48184013f" pod="openshift-must-gather-q9brd/must-gather-c86kz" err="pods \"must-gather-c86kz\" is forbidden: User \"system:node:ip-10-0-137-28.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-q9brd\": no relationship found between node 'ip-10-0-137-28.ec2.internal' and this object" Apr 24 22:31:40.601320 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.601237 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4qpw\" (UniqueName: \"kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw\") pod \"b61eea04-20da-4698-b725-a3b48184013f\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " Apr 24 22:31:40.601320 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.601282 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output\") pod \"b61eea04-20da-4698-b725-a3b48184013f\" (UID: \"b61eea04-20da-4698-b725-a3b48184013f\") " Apr 24 22:31:40.602851 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.602793 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b61eea04-20da-4698-b725-a3b48184013f" (UID: "b61eea04-20da-4698-b725-a3b48184013f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:40.603449 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.603420 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw" (OuterVolumeSpecName: "kube-api-access-l4qpw") pod "b61eea04-20da-4698-b725-a3b48184013f" (UID: "b61eea04-20da-4698-b725-a3b48184013f"). InnerVolumeSpecName "kube-api-access-l4qpw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:40.702098 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.702061 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4qpw\" (UniqueName: \"kubernetes.io/projected/b61eea04-20da-4698-b725-a3b48184013f-kube-api-access-l4qpw\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 22:31:40.702098 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.702093 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b61eea04-20da-4698-b725-a3b48184013f-must-gather-output\") on node \"ip-10-0-137-28.ec2.internal\" DevicePath \"\"" Apr 24 22:31:40.873027 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.872949 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q9brd_must-gather-c86kz_b61eea04-20da-4698-b725-a3b48184013f/copy/0.log" Apr 24 22:31:40.873282 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.873256 2574 generic.go:358] "Generic (PLEG): container finished" podID="b61eea04-20da-4698-b725-a3b48184013f" containerID="3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8" exitCode=143 Apr 24 22:31:40.873371 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.873324 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q9brd/must-gather-c86kz" Apr 24 22:31:40.873430 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.873364 2574 scope.go:117] "RemoveContainer" containerID="3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8" Apr 24 22:31:40.876175 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.876148 2574 status_manager.go:895] "Failed to get status for pod" podUID="b61eea04-20da-4698-b725-a3b48184013f" pod="openshift-must-gather-q9brd/must-gather-c86kz" err="pods \"must-gather-c86kz\" is forbidden: User \"system:node:ip-10-0-137-28.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-q9brd\": no relationship found between node 'ip-10-0-137-28.ec2.internal' and this object" Apr 24 22:31:40.880519 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.880496 2574 scope.go:117] "RemoveContainer" containerID="225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da" Apr 24 22:31:40.883340 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.883318 2574 status_manager.go:895] "Failed to get status for pod" podUID="b61eea04-20da-4698-b725-a3b48184013f" pod="openshift-must-gather-q9brd/must-gather-c86kz" err="pods \"must-gather-c86kz\" is forbidden: User \"system:node:ip-10-0-137-28.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-q9brd\": no relationship found between node 'ip-10-0-137-28.ec2.internal' and this object" Apr 24 22:31:40.892712 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.892690 2574 scope.go:117] "RemoveContainer" containerID="3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8" Apr 24 22:31:40.892958 ip-10-0-137-28 kubenswrapper[2574]: E0424 22:31:40.892938 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8\": container with ID starting 
with 3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8 not found: ID does not exist" containerID="3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8" Apr 24 22:31:40.893008 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.892966 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8"} err="failed to get container status \"3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8\": rpc error: code = NotFound desc = could not find container \"3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8\": container with ID starting with 3f88967d05c3c653f0154b4bcfca959883e0ab3d4d07516838cbe8ef8aa189d8 not found: ID does not exist" Apr 24 22:31:40.893008 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.892984 2574 scope.go:117] "RemoveContainer" containerID="225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da" Apr 24 22:31:40.893186 ip-10-0-137-28 kubenswrapper[2574]: E0424 22:31:40.893168 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da\": container with ID starting with 225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da not found: ID does not exist" containerID="225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da" Apr 24 22:31:40.893239 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:40.893195 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da"} err="failed to get container status \"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da\": rpc error: code = NotFound desc = could not find container \"225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da\": container with ID starting with 
225a5d13a994369f58c506c31047b8a6a4f36fb81fc631af63ea249fdd4a72da not found: ID does not exist" Apr 24 22:31:41.787089 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:41.787060 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61eea04-20da-4698-b725-a3b48184013f" path="/var/lib/kubelet/pods/b61eea04-20da-4698-b725-a3b48184013f/volumes" Apr 24 22:31:42.079378 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.079297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-2rpxz_2eb9d760-626d-4d98-9d3e-3f022ca09d78/cluster-monitoring-operator/0.log" Apr 24 22:31:42.205798 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.205770 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8822h_3436e07c-cd57-497e-ab5a-9e3d447d77f8/monitoring-plugin/0.log" Apr 24 22:31:42.237068 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.237047 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/node-exporter/0.log" Apr 24 22:31:42.260242 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.260220 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/kube-rbac-proxy/0.log" Apr 24 22:31:42.281126 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.281103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hzb5l_89365de9-d284-465e-a69f-20a06b192232/init-textfile/0.log" Apr 24 22:31:42.555921 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.555890 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/prometheus/0.log" Apr 24 22:31:42.574553 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.574523 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/config-reloader/0.log" Apr 24 22:31:42.598297 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.598272 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/thanos-sidecar/0.log" Apr 24 22:31:42.619016 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.618993 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/kube-rbac-proxy-web/0.log" Apr 24 22:31:42.642676 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.642651 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/kube-rbac-proxy/0.log" Apr 24 22:31:42.666281 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.666251 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/kube-rbac-proxy-thanos/0.log" Apr 24 22:31:42.689278 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.689252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5543b5e0-dff7-4ddc-9c06-a4d3f79d6427/init-config-reloader/0.log" Apr 24 22:31:42.718385 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.718358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmqcd_7ce31d54-5e72-4ae6-b9c8-32c890858e6d/prometheus-operator/0.log" Apr 24 22:31:42.737499 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:42.737469 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rmqcd_7ce31d54-5e72-4ae6-b9c8-32c890858e6d/kube-rbac-proxy/0.log" Apr 24 22:31:45.314732 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.314705 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7d4mv_66268dd3-4212-4861-bf78-8f224b9e2ec4/volume-data-source-validator/0.log"
Apr 24 22:31:45.432870 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.432817 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"]
Apr 24 22:31:45.433228 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433208 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="gather"
Apr 24 22:31:45.433331 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433231 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="gather"
Apr 24 22:31:45.433331 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433251 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="copy"
Apr 24 22:31:45.433331 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433258 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="copy"
Apr 24 22:31:45.433474 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433343 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="gather"
Apr 24 22:31:45.433474 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.433358 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b61eea04-20da-4698-b725-a3b48184013f" containerName="copy"
Apr 24 22:31:45.438490 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.438464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.440744 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.440725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-j4vvx\"/\"default-dockercfg-c2k9q\""
Apr 24 22:31:45.441646 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.441632 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"kube-root-ca.crt\""
Apr 24 22:31:45.441720 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.441663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"openshift-service-ca.crt\""
Apr 24 22:31:45.443506 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.443488 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"]
Apr 24 22:31:45.538256 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.538219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-podres\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.538414 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.538264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-proc\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.538414 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.538297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-sys\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.538414 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.538322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2thnm\" (UniqueName: \"kubernetes.io/projected/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-kube-api-access-2thnm\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.538414 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.538349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-lib-modules\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.639771 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-sys\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.639771 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2thnm\" (UniqueName: \"kubernetes.io/projected/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-kube-api-access-2thnm\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.639956 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-sys\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.639956 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-lib-modules\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.639956 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-podres\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.640073 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.639986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-proc\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.640073 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.640026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-lib-modules\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.640073 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.640059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-proc\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.640167 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.640096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-podres\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.648223 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.648196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2thnm\" (UniqueName: \"kubernetes.io/projected/665dde60-d09b-4e8a-bf0a-75b53ddf4e33-kube-api-access-2thnm\") pod \"perf-node-gather-daemonset-4rn5r\" (UID: \"665dde60-d09b-4e8a-bf0a-75b53ddf4e33\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.749137 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.749097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:45.875100 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.875074 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"]
Apr 24 22:31:45.877501 ip-10-0-137-28 kubenswrapper[2574]: W0424 22:31:45.877475 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod665dde60_d09b_4e8a_bf0a_75b53ddf4e33.slice/crio-7bc849c7af419011abc56f06a8418808478da4f5d5d8e3d1d8af1a6ac21c0ec4 WatchSource:0}: Error finding container 7bc849c7af419011abc56f06a8418808478da4f5d5d8e3d1d8af1a6ac21c0ec4: Status 404 returned error can't find the container with id 7bc849c7af419011abc56f06a8418808478da4f5d5d8e3d1d8af1a6ac21c0ec4
Apr 24 22:31:45.890267 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:45.890214 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r" event={"ID":"665dde60-d09b-4e8a-bf0a-75b53ddf4e33","Type":"ContainerStarted","Data":"7bc849c7af419011abc56f06a8418808478da4f5d5d8e3d1d8af1a6ac21c0ec4"}
Apr 24 22:31:46.060757 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.060730 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w6kmm_258d3fbf-bfd8-408b-9638-e130192183f7/dns/0.log"
Apr 24 22:31:46.086739 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.086713 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w6kmm_258d3fbf-bfd8-408b-9638-e130192183f7/kube-rbac-proxy/0.log"
Apr 24 22:31:46.108526 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.108505 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5k6kg_00622ba2-e987-487b-870b-1558450fa114/dns-node-resolver/0.log"
Apr 24 22:31:46.643624 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.643599 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l5t5z_8b42bf05-9792-4dd3-9486-e262d6b7afc8/node-ca/0.log"
Apr 24 22:31:46.894502 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.894407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r" event={"ID":"665dde60-d09b-4e8a-bf0a-75b53ddf4e33","Type":"ContainerStarted","Data":"e68e8947748a372ab3f8bf056779d19359a2599c16dc43063fa623ccfded0763"}
Apr 24 22:31:46.894661 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.894586 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:46.912353 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:46.912305 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r" podStartSLOduration=1.912292988 podStartE2EDuration="1.912292988s" podCreationTimestamp="2026-04-24 22:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:46.910387151 +0000 UTC m=+3919.717743767" watchObservedRunningTime="2026-04-24 22:31:46.912292988 +0000 UTC m=+3919.719649592"
Apr 24 22:31:47.357103 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:47.357078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-68fd45549b-hj7lt_28741c17-0dca-4049-bd6e-23d87c1354b0/router/0.log"
Apr 24 22:31:47.684074 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:47.684051 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p9djn_07d12594-0cd4-4f7e-8c3a-c529a1051347/serve-healthcheck-canary/0.log"
Apr 24 22:31:48.042760 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:48.042686 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zg8z7_2aa96248-3e79-4b6e-b5ab-600b84643235/insights-operator/0.log"
Apr 24 22:31:48.044150 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:48.044133 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zg8z7_2aa96248-3e79-4b6e-b5ab-600b84643235/insights-operator/1.log"
Apr 24 22:31:48.125509 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:48.125478 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hbtb6_3916a983-0b17-4fc6-aaf4-6f315f378c9f/kube-rbac-proxy/0.log"
Apr 24 22:31:48.148024 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:48.148001 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hbtb6_3916a983-0b17-4fc6-aaf4-6f315f378c9f/exporter/0.log"
Apr 24 22:31:48.171145 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:48.171115 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hbtb6_3916a983-0b17-4fc6-aaf4-6f315f378c9f/extractor/0.log"
Apr 24 22:31:50.475663 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:50.475616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-kvldg_fdf952ab-8b2a-40b0-9931-11e5e87ee6bd/s3-init/0.log"
Apr 24 22:31:50.499053 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:50.499027 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-pcwzc_3dd70a66-aa34-4642-a4fe-10785d0b74e4/s3-tls-init-custom/0.log"
Apr 24 22:31:50.520721 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:50.520694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-jrh6n_06f7f7d1-6514-4447-8836-6c9f124fcb75/s3-tls-init-serving/0.log"
Apr 24 22:31:52.906556 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:52.906530 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-4rn5r"
Apr 24 22:31:55.028450 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:55.028412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nmnkl_30ae49ff-70be-49a5-864a-ffbc96166c41/kube-storage-version-migrator-operator/1.log"
Apr 24 22:31:55.029358 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:55.029338 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nmnkl_30ae49ff-70be-49a5-864a-ffbc96166c41/kube-storage-version-migrator-operator/0.log"
Apr 24 22:31:56.162119 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.162088 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:31:56.187061 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.187036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/egress-router-binary-copy/0.log"
Apr 24 22:31:56.215507 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.215481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/cni-plugins/0.log"
Apr 24 22:31:56.241493 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.241468 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/bond-cni-plugin/0.log"
Apr 24 22:31:56.265823 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.265764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/routeoverride-cni/0.log"
Apr 24 22:31:56.288325 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.288299 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/whereabouts-cni-bincopy/0.log"
Apr 24 22:31:56.309920 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.309902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-glnmw_3cd95ab3-6a59-416f-8237-2554fc18b54f/whereabouts-cni/0.log"
Apr 24 22:31:56.563780 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.563708 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-msllw_26ae5960-a77b-482e-891f-f7d7f829e0d2/kube-multus/0.log"
Apr 24 22:31:56.584326 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.584301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9csmp_8ed80245-164d-4d1c-8ed3-05523db4cd57/network-metrics-daemon/0.log"
Apr 24 22:31:56.609306 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:56.609288 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9csmp_8ed80245-164d-4d1c-8ed3-05523db4cd57/kube-rbac-proxy/0.log"
Apr 24 22:31:57.813446 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.813410 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-controller/0.log"
Apr 24 22:31:57.840729 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.840700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/0.log"
Apr 24 22:31:57.859971 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.859941 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovn-acl-logging/1.log"
Apr 24 22:31:57.881466 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.881433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/kube-rbac-proxy-node/0.log"
Apr 24 22:31:57.905524 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.905493 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:31:57.923906 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.923883 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/northd/0.log"
Apr 24 22:31:57.956143 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.956119 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/nbdb/0.log"
Apr 24 22:31:57.987159 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:57.987122 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/sbdb/0.log"
Apr 24 22:31:58.092273 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:58.092194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9dbf_8530569c-6697-47e7-b09a-7423346a9a16/ovnkube-controller/0.log"
Apr 24 22:31:59.596976 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:31:59.596943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lqj24_a6bcdb22-0356-4540-8553-9a968d14ba41/network-check-target-container/0.log"
Apr 24 22:32:00.449424 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:32:00.449398 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6h52t_9e5b0fff-78bc-4c1d-8ac3-bc6cf30d06de/iptables-alerter/0.log"
Apr 24 22:32:01.130458 ip-10-0-137-28 kubenswrapper[2574]: I0424 22:32:01.130385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hwpr2_c1aab734-6300-41c5-9e50-7c87b69a3861/tuned/0.log"