Apr 20 09:58:56.284023 ip-10-0-140-95 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 09:58:56.284035 ip-10-0-140-95 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 09:58:56.284042 ip-10-0-140-95 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 09:58:56.284252 ip-10-0-140-95 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 09:59:06.396814 ip-10-0-140-95 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 09:59:06.396833 ip-10-0-140-95 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 826d7dbfaa8040d3897a8c79ebded887 --
Apr 20 10:01:13.456954 ip-10-0-140-95 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 10:01:13.898746 ip-10-0-140-95 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:13.898746 ip-10-0-140-95 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 10:01:13.898746 ip-10-0-140-95 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:13.898746 ip-10-0-140-95 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 10:01:13.898746 ip-10-0-140-95 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:13.901340 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.901270 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 10:01:13.907989 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.907968 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:13.907989 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.907986 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:13.907989 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.907990 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:13.907989 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.907993 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.907997 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908002 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908013 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908017 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908020 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908022 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908025 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908028 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908030 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908033 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908035 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908038 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908040 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908043 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908046 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908048 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908051 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908053 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908056 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:13.908141 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908058 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908061 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908063 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908067 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908078 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908081 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908084 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908087 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908089 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908092 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908094 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908097 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908099 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908102 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908105 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908108 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908111 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908114 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908117 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:13.908610 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908120 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908123 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908126 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908128 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908131 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908133 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908136 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908138 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908141 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908144 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908147 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908150 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908152 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908155 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908158 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908160 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908162 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908165 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908167 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908170 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:13.909077 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908173 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908175 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908177 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908180 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908183 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908187 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908190 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908192 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908195 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908197 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908200 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908202 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908205 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908207 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908210 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908212 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908222 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908225 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908227 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908230 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:13.909554 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908232 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908235 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908238 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908240 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908638 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908643 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908646 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908650 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908666 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908669 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908672 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908674 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908677 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908680 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908682 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908685 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908688 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908691 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908694 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:13.910053 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908697 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908699 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908702 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908705 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908707 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908710 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908712 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908715 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908718 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908726 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908729 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908732 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908734 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908737 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908739 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908742 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908745 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908747 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908750 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908752 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:13.910543 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908755 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908757 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908760 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908762 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908766 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908769 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908772 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908775 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908778 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908781 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908783 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908786 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908789 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908792 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908795 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908797 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908800 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908802 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908805 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908807 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:13.911052 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908810 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908812 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908820 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908823 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908825 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908828 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908831 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908833 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908835 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908838 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908840 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908843 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908845 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908847 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908850 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908853 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908856 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908858 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908861 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908864 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:13.911537 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908866 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908869 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908871 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908875 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908878 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908881 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908883 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908885 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908888 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908890 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.908893 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910200 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910209 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910218 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910221 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910233 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910236 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910240 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910244 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910247 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910250 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910253 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 10:01:13.912037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910256 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910259 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910262 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910265 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910268 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910271 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910274 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910277 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910283 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910286 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910289 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910292 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910295 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910299 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910302 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910305 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910308 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910311 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910313 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910316 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910319 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910322 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910326 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910329 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910332 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 10:01:13.912561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910336 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910340 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910343 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910349 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910352 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910355 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910358 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910361 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910365 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910368 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910371 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910373 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910377 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 10:01:13.913174
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910380 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910383 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910385 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910388 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910391 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910394 2577 flags.go:64] FLAG: --feature-gates="" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910397 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910400 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910403 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910406 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910409 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910412 2577 flags.go:64] FLAG: --help="false" Apr 20 10:01:13.913174 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910414 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-140-95.ec2.internal" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910417 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910420 2577 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910423 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910427 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910430 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910433 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910436 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910439 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910442 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910445 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910448 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910451 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910454 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910456 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910459 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:01:13.910462 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910465 2577 flags.go:64] FLAG: --lock-file="" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910468 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910471 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910473 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910478 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910481 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910484 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 10:01:13.913781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910486 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910489 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910492 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910496 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910498 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910502 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910505 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 10:01:13.914360 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910509 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910512 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910514 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910517 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910520 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910523 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910525 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910528 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910536 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910539 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910541 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910545 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910548 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910553 2577 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910556 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910558 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910561 2577 flags.go:64] FLAG: --port="10250" Apr 20 10:01:13.914360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910564 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910567 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f75f020b87609077" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910570 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910572 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910575 2577 flags.go:64] FLAG: --register-node="true" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910578 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910580 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910584 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910587 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910589 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910592 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910596 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 10:01:13.914943 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910599 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910602 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910604 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910607 2577 flags.go:64] FLAG: --runonce="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910610 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910613 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910615 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910618 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910621 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910624 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910627 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910630 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910633 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910636 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 10:01:13.914943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910639 2577 
flags.go:64] FLAG: --storage-driver-user="root" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910642 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910645 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910648 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910651 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910672 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910677 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910680 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910684 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910687 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910690 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910693 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910696 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910698 2577 flags.go:64] FLAG: --v="2" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910702 2577 flags.go:64] FLAG: --version="false" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:01:13.910706 2577 flags.go:64] FLAG: --vmodule="" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910710 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.910713 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910800 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910804 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910807 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910810 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910813 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:13.915588 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910816 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910819 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910822 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910825 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910828 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 
10:01:13.910831 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910834 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910837 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910839 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910842 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910845 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910848 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910850 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910853 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910855 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910858 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910860 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910863 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 
20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910865 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910869 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 10:01:13.916168 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910872 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910875 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910877 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910880 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910882 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910885 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910887 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910890 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910892 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910895 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910898 2577 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910900 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910902 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910905 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910910 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910912 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910915 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910917 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910920 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910922 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:13.916675 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910925 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910927 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910930 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:13.917164 
ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910933 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910936 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910938 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910941 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910944 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910946 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910949 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910951 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910954 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910956 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910959 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910961 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910964 2577 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910966 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910970 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910973 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910975 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:13.917164 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910977 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910980 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910982 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910985 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910987 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910990 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910993 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.910996 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 
10:01:13.910999 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911001 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911003 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911006 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911008 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911011 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911014 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911018 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911021 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911024 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911027 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911030 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:13.917651 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.911033 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:13.918171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.911840 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:13.918171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.918072 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 10:01:13.918171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.918161 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918207 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918211 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918214 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918217 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918220 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918222 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918225 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918227 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918230 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918232 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918235 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918238 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918241 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918243 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918245 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918248 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918250 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918253 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918255 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:13.918259 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918258 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918260 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918263 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918266 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918269 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918271 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918275 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918277 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918280 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918283 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918285 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918287 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918290 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918292 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918295 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918297 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918300 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918302 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918305 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:13.918787 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918307 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918309 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918312 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918314 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918316 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918319 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918321 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918323 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918326 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918329 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918331 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918334 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918336 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918338 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918341 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918345 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918348 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918350 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918353 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918356 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:13.919233 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918360 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918363 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918366 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918368 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918371 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918375 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918378 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918381 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918384 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918387 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918389 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918392 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918394 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918397 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918400 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918402 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918405 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918407 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918409 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:13.919728 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918412 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918415 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918417 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918420 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918422 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918425 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918427 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918430 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918432 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.918437 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918526 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918530 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918534 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918536 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918539 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918542 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:13.920194 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918544 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918547 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918549 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918552 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918555 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918557 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918559 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918562 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918564 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918567 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918569 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918572 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918574 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918576 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918579 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918581 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918584 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918586 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918589 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918591 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:13.920591 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918593 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918596 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918598 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918601 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918603 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918606 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918608 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918611 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918613 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918617 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918621 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918623 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918626 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918629 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918631 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918634 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918636 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918639 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918641 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:13.921159 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918643 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918646 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918648 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918651 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918669 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918672 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918674 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918677 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918679 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918682 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918684 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918687 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918689 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918692 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918694 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918697 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918699 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918701 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918704 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:13.921615 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918707 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918710 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918712 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918715 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918719 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918722 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918726 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918729 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918732 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918734 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918737 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918739 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918742 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918744 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918746 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918749 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918752 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918754 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918756 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918759 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:13.922120 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918762 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:13.922580 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:13.918764 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:13.922580 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.918769 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:13.922580 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.919401 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 10:01:13.922781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.922767 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 10:01:13.923670 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.923645 2577 server.go:1019] "Starting client certificate rotation"
Apr 20 10:01:13.923782 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.923763 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 10:01:13.923837 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.923816 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 10:01:13.945998 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.945979 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 10:01:13.951252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.951136 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 10:01:13.966229 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.966208 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 20 10:01:13.971523 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.971510 2577 log.go:25] "Validated CRI v1 image API"
Apr 20 10:01:13.972706 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.972692 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 10:01:13.977060 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.977043 2577 fs.go:135] Filesystem UUIDs: map[17dd6b5b-23c7-4976-a375-ad3ca16edf21:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 bbbea4ed-329f-4fe6-b029-e82170f15de1:/dev/nvme0n1p3]
Apr 20 10:01:13.977121 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.977061 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 10:01:13.981673 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.981561 2577 manager.go:217] Machine: {Timestamp:2026-04-20 10:01:13.980464642 +0000 UTC m=+0.403525776 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101362 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ea91e031faf91d3dcf8098145e2ba SystemUUID:ec2ea91e-031f-af91-d3dc-f8098145e2ba BootID:826d7dbf-aa80-40d3-897a-8c79ebded887 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:31:0b:05:48:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:31:0b:05:48:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:1a:ab:47:dd:05 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 10:01:13.981738 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.981675 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 10:01:13.981779 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.981767 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 10:01:13.983452 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.983430 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 10:01:13.983580 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.983455 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-95.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 10:01:13.983623 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.983591 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 10:01:13.983623 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.983599 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 10:01:13.983623 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.983612 2577
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 10:01:13.984646 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.984635 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 10:01:13.985924 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.985915 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 10:01:13.986033 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.986025 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 10:01:13.989738 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.989727 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 20 10:01:13.989785 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.989752 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 10:01:13.989785 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.989763 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 10:01:13.989785 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.989774 2577 kubelet.go:397] "Adding apiserver pod source" Apr 20 10:01:13.989901 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.989792 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 10:01:13.990803 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.990792 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 10:01:13.990843 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.990810 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 10:01:13.992064 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.992049 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 
10:01:13.993786 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.993764 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 10:01:13.995148 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.995137 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 10:01:13.999078 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999060 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 10:01:13.999078 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999082 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999091 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999100 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999110 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999126 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999135 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999144 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999155 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999165 2577 
plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999177 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 10:01:13.999208 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:13.999190 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 10:01:14.000136 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.000116 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 10:01:14.000247 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.000158 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 10:01:14.003674 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.003641 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 10:01:14.003761 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.003708 2577 server.go:1295] "Started kubelet" Apr 20 10:01:14.003854 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.003811 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 10:01:14.003854 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.003819 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 10:01:14.003953 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.003882 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 10:01:14.004634 ip-10-0-140-95 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 10:01:14.005034 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.005012 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 10:01:14.005277 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.005142 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 10:01:14.005277 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.005190 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 10:01:14.005277 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.005198 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-95.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 10:01:14.006280 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.006266 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 10:01:14.009523 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.009503 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xrwk" Apr 20 10:01:14.011268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.011253 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 10:01:14.011268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.011267 2577 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kubelet-serving" Apr 20 10:01:14.012113 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.012094 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found" Apr 20 10:01:14.012404 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.012385 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 10:01:14.012880 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.011088 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-95.ec2.internal.18a80862eec9ac76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-95.ec2.internal,UID:ip-10-0-140-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-95.ec2.internal,},FirstTimestamp:2026-04-20 10:01:14.003672182 +0000 UTC m=+0.426733318,LastTimestamp:2026-04-20 10:01:14.003672182 +0000 UTC m=+0.426733318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-95.ec2.internal,}" Apr 20 10:01:14.012880 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.012806 2577 factory.go:55] Registering systemd factory Apr 20 10:01:14.012880 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.012827 2577 factory.go:223] Registration of the systemd container factory successfully Apr 20 10:01:14.013127 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013107 2577 factory.go:153] Registering CRI-O factory Apr 20 10:01:14.013127 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013127 2577 factory.go:223] Registration of the crio container factory successfully Apr 20 
10:01:14.013278 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013180 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 10:01:14.013278 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013203 2577 factory.go:103] Registering Raw factory Apr 20 10:01:14.013278 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013220 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 10:01:14.013437 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013396 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 10:01:14.013437 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013418 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 10:01:14.013567 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013551 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 10:01:14.013567 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013566 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 10:01:14.013687 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.013618 2577 manager.go:319] Starting recovery of all containers Apr 20 10:01:14.016863 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.016823 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 10:01:14.018745 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.018719 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xrwk" Apr 20 10:01:14.019043 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.019023 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 10:01:14.019108 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.019089 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 10:01:14.023746 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.023729 2577 manager.go:324] Recovery completed Apr 20 10:01:14.029161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.029147 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.030975 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.030962 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.031050 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.030989 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.031050 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.031002 2577 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.031479 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.031464 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 10:01:14.031479 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.031479 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 10:01:14.031559 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.031493 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 10:01:14.033620 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.033561 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-95.ec2.internal.18a80862f06a4807 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-95.ec2.internal,UID:ip-10-0-140-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-95.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-95.ec2.internal,},FirstTimestamp:2026-04-20 10:01:14.030974983 +0000 UTC m=+0.454036116,LastTimestamp:2026-04-20 10:01:14.030974983 +0000 UTC m=+0.454036116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-95.ec2.internal,}" Apr 20 10:01:14.034552 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.034540 2577 policy_none.go:49] "None policy: Start" Apr 20 10:01:14.034600 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.034555 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 10:01:14.034600 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.034565 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 10:01:14.073180 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073167 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.073226 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073239 2577 server.go:85] "Starting device plugin registration server" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073427 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073436 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073538 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073612 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.073619 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.074396 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.074434 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-95.ec2.internal\" not found" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.075766 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.077104 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.077126 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.077148 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.077157 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.077224 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 10:01:14.091817 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.078953 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:14.174535 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.174493 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.175929 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.175911 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.175999 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.175940 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.175999 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.175951 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.175999 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.175979 2577 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.178080 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.178063 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal"] Apr 20 10:01:14.178132 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.178122 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.180061 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.180044 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.180151 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.180074 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.180151 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.180088 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.182234 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.182221 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.182385 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.182371 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.182424 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.182402 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.183034 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183021 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.183086 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183031 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.183086 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183047 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.183086 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183055 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.183086 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183062 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.183086 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.183066 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.185144 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.185124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.185235 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.185158 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:14.186057 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.186040 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.186139 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.186065 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-95.ec2.internal\": node \"ip-10-0-140-95.ec2.internal\" not found" Apr 20 10:01:14.186139 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.186044 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:14.186232 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.186154 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:14.186232 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.186169 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:14.215223 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.215198 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-95.ec2.internal\" not found" node="ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.215223 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.215210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/36c01a4c8423a95e1f1d0a7a3f9614c6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-95.ec2.internal\" (UID: 
\"36c01a4c8423a95e1f1d0a7a3f9614c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.215351 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.215242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.215351 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.215270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.218836 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.218817 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found" Apr 20 10:01:14.219527 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.219512 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-95.ec2.internal\" not found" node="ip-10-0-140-95.ec2.internal" Apr 20 10:01:14.315838 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" Apr 
20 10:01:14.315940 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.315940 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/36c01a4c8423a95e1f1d0a7a3f9614c6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-95.ec2.internal\" (UID: \"36c01a4c8423a95e1f1d0a7a3f9614c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.315940 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/36c01a4c8423a95e1f1d0a7a3f9614c6-config\") pod \"kube-apiserver-proxy-ip-10-0-140-95.ec2.internal\" (UID: \"36c01a4c8423a95e1f1d0a7a3f9614c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.315940 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.316099 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.315952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5bc7a240581bd3a3cc5a7e6319d273b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal\" (UID: \"e5bc7a240581bd3a3cc5a7e6319d273b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.318875 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.318860 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.419720 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.419695 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.517455 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.517392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.519792 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.519772 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.521953 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.521938 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:14.620009 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.619970 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.720575 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.720548 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.821081 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.821025 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.838262 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.838232 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:14.921584 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:14.921562 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:14.923723 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.923696 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 10:01:14.923830 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.923799 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 10:01:14.923883 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:14.923817 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 10:01:15.011597 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.011470 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 10:01:15.021176 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.021129 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 09:56:14 +0000 UTC" deadline="2027-09-16 05:16:29.79467533 +0000 UTC"
Apr 20 10:01:15.021176 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.021174 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12331h15m14.773505064s"
Apr 20 10:01:15.022360 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:15.022339 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:15.029178 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.029162 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 10:01:15.056386 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.056362 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-58g99"
Apr 20 10:01:15.063826 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.063810 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-58g99"
Apr 20 10:01:15.070926 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:15.070903 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c01a4c8423a95e1f1d0a7a3f9614c6.slice/crio-8732b04961a0bfe5f03e81975f3a9d9bc55dbbc9c7e9f87759a1bf84d5b2fb9a WatchSource:0}: Error finding container 8732b04961a0bfe5f03e81975f3a9d9bc55dbbc9c7e9f87759a1bf84d5b2fb9a: Status 404 returned error can't find the container with id 8732b04961a0bfe5f03e81975f3a9d9bc55dbbc9c7e9f87759a1bf84d5b2fb9a
Apr 20 10:01:15.071384 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:15.071349 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bc7a240581bd3a3cc5a7e6319d273b.slice/crio-d451bc328017a69439dde98e419a5bb905d6da22987999a710eb415e4e7ee568 WatchSource:0}: Error finding container d451bc328017a69439dde98e419a5bb905d6da22987999a710eb415e4e7ee568: Status 404 returned error can't find the container with id d451bc328017a69439dde98e419a5bb905d6da22987999a710eb415e4e7ee568
Apr 20 10:01:15.075442 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.075431 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 10:01:15.080324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.080289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal" event={"ID":"36c01a4c8423a95e1f1d0a7a3f9614c6","Type":"ContainerStarted","Data":"8732b04961a0bfe5f03e81975f3a9d9bc55dbbc9c7e9f87759a1bf84d5b2fb9a"}
Apr 20 10:01:15.081196 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.081174 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" event={"ID":"e5bc7a240581bd3a3cc5a7e6319d273b","Type":"ContainerStarted","Data":"d451bc328017a69439dde98e419a5bb905d6da22987999a710eb415e4e7ee568"}
Apr 20 10:01:15.123466 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:15.123444 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:15.223911 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:15.223889 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-95.ec2.internal\" not found"
Apr 20 10:01:15.228631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.228607 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:15.312334 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.312316 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:15.327764 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.327717 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 10:01:15.328646 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.328634 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal"
Apr 20 10:01:15.335740 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.335723 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 10:01:15.586399 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.586327 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:15.990092 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:15.990031 2577 apiserver.go:52] "Watching apiserver"
Apr 20 10:01:16.001858 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.001831 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 10:01:16.002198 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.002172 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-j4g46","kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal","openshift-cluster-node-tuning-operator/tuned-q7mlz","openshift-image-registry/node-ca-zsznm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal","openshift-multus/multus-additional-cni-plugins-4bp75","openshift-network-diagnostics/network-check-target-gm5vg","openshift-network-operator/iptables-alerter-tjxs7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth","openshift-dns/node-resolver-fr996","openshift-multus/multus-kwc9j","openshift-multus/network-metrics-daemon-vs775","openshift-ovn-kubernetes/ovnkube-node-bxbxw"]
Apr 20 10:01:16.007179 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.007149 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:16.007283 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.007232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:16.009236 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.009212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.009329 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.009292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zsznm"
Apr 20 10:01:16.011369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.011352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.011901 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.011874 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.011901 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.011887 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sf22m\""
Apr 20 10:01:16.012093 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.012054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.012093 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.012089 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.012217 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.012119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.012217 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.012200 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 10:01:16.013308 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.013142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mbqcm\""
Apr 20 10:01:16.013843 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.013818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 10:01:16.014197 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014178 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvgjl\""
Apr 20 10:01:16.014284 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:16.014284 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 10:01:16.014284 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014240 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.014602 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.014700 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.014635 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 10:01:16.017811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.016494 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjxs7"
Apr 20 10:01:16.017811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.016559 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 10:01:16.017811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.017015 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 10:01:16.017811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.017146 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnnxf\""
Apr 20 10:01:16.019039 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.019023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.019196 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.019182 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.019373 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.019358 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 10:01:16.019428 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.019407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n6kl2\""
Apr 20 10:01:16.021076 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.021057 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth"
Apr 20 10:01:16.021185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.021122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fr996"
Apr 20 10:01:16.023023 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.022993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2eb7affb-8768-41ec-85fb-a62a41bb8709-serviceca\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm"
Apr 20 10:01:16.023117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.023117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-lib-modules\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023259 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-tmp\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023259 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9kx\" (UniqueName: \"kubernetes.io/projected/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-kube-api-access-dl9kx\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023259 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-os-release\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.023259 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-modprobe-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023299 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kwc9j"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-sys\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eb7affb-8768-41ec-85fb-a62a41bb8709-host\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.023435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.023370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxq84\" (UniqueName: \"kubernetes.io/projected/ada0f1d6-3214-4751-9778-3af57b7e44c0-kube-api-access-wxq84\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.024923 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.024902 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.025074 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.025322 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025302 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 10:01:16.025514 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025497 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.025585 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025539 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-n4zld\""
Apr 20 10:01:16.025585 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mspqh\""
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.025991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysconfig\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-kubernetes\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cqx9\" (UniqueName: \"kubernetes.io/projected/2eb7affb-8768-41ec-85fb-a62a41bb8709-kube-api-access-7cqx9\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-system-cni-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026331 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-conf\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-tuned\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-cnibin\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-systemd\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-var-lib-kubelet\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-run\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-host\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ff1633f6-fc91-4bfb-955c-d10341913ddc-agent-certs\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:16.026888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.026642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff1633f6-fc91-4bfb-955c-d10341913ddc-konnectivity-ca\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:16.027537 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.027164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 10:01:16.027537 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.027443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cspcd\""
Apr 20 10:01:16.029140 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.029119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:16.029140 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.029128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:16.029282 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.029253 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:16.031934 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.031917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 10:01:16.032023 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.031944 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 10:01:16.032023 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.031962 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 10:01:16.033193 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.033174 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-w9fd6\""
Apr 20 10:01:16.033267 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.033190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 10:01:16.033267 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.033232 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 10:01:16.033372 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.033195 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 10:01:16.064993 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.064944 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:15 +0000 UTC" deadline="2027-11-30 19:51:10.353689594 +0000 UTC"
Apr 20 10:01:16.064993 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.064967 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14145h49m54.288724902s"
Apr 20 10:01:16.113317 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.113293 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 10:01:16.127315 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-kubelet\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j"
Apr 20 10:01:16.127460 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.127460 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2eb7affb-8768-41ec-85fb-a62a41bb8709-serviceca\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm"
Apr 20 10:01:16.127460 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75"
Apr 20 10:01:16.127460 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx9mm\" (UniqueName: \"kubernetes.io/projected/a533ff84-2d61-4ccb-9b58-1eea4acb387d-kube-api-access-gx9mm\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996"
Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-k8s-cni-cncf-io\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j"
Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-netns\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j"
Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz"
Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-config\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-lib-modules\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.127683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-tmp\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-bin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkm6b\" (UniqueName: \"kubernetes.io/projected/0d20a880-ecf7-405a-98f4-141bc115d61b-kube-api-access-rkm6b\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-bin\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-system-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2eb7affb-8768-41ec-85fb-a62a41bb8709-serviceca\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-conf-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.127972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-etc-kubernetes\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.127981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-systemd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-env-overrides\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-lib-modules\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-sys\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-sys\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128333 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wxq84\" (UniqueName: \"kubernetes.io/projected/ada0f1d6-3214-4751-9778-3af57b7e44c0-kube-api-access-wxq84\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128387 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f23702f-f4ca-4d67-8cde-3f062233913d-iptables-alerter-script\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysconfig\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128510 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-kubernetes\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cqx9\" (UniqueName: \"kubernetes.io/projected/2eb7affb-8768-41ec-85fb-a62a41bb8709-kube-api-access-7cqx9\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysconfig\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-kubernetes\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-system-cni-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128608 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-system-cni-dir\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-hostroot\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-daemon-config\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.128828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-etc-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.128828 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-tuned\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-cnibin\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.128913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-cnibin\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-device-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: 
\"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a533ff84-2d61-4ccb-9b58-1eea4acb387d-tmp-dir\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-socket-dir-parent\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-kubelet\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-var-lib-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-systemd\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-var-lib-kubelet\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-var-lib-kubelet\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-systemd\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-sys-fs\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-cnibin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:16.129486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqw6x\" (UniqueName: \"kubernetes.io/projected/6a07ac99-a265-4370-a43b-b11246f741de-kube-api-access-fqw6x\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-systemd-units\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-ovn\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129551 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-log-socket\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f23702f-f4ca-4d67-8cde-3f062233913d-host-slash\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-cni-binary-copy\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-multus\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.130244 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-multus-certs\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-slash\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-script-lib\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.129980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9kx\" (UniqueName: \"kubernetes.io/projected/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-kube-api-access-dl9kx\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-os-release\") pod \"multus-additional-cni-plugins-4bp75\" (UID: 
\"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-socket-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-os-release\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ada0f1d6-3214-4751-9778-3af57b7e44c0-os-release\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.130244 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-netns\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-modprobe-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eb7affb-8768-41ec-85fb-a62a41bb8709-host\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-node-log\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130376 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-conf\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-registration-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-netd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deaf1642-316d-4307-8ade-dc653dd9e116-ovn-node-metrics-cert\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eb7affb-8768-41ec-85fb-a62a41bb8709-host\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.130841 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:01:16.130518 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwqbz\" (UniqueName: \"kubernetes.io/projected/deaf1642-316d-4307-8ade-dc653dd9e116-kube-api-access-dwqbz\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs947\" (UniqueName: \"kubernetes.io/projected/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kube-api-access-xs947\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-modprobe-d\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-run\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-host\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ff1633f6-fc91-4bfb-955c-d10341913ddc-agent-certs\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46" Apr 20 10:01:16.130841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff1633f6-fc91-4bfb-955c-d10341913ddc-konnectivity-ca\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ada0f1d6-3214-4751-9778-3af57b7e44c0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn46t\" (UniqueName: \"kubernetes.io/projected/5f23702f-f4ca-4d67-8cde-3f062233913d-kube-api-access-wn46t\") pod 
\"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-etc-selinux\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a533ff84-2d61-4ccb-9b58-1eea4acb387d-hosts-file\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.130959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-sysctl-conf\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.131009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-run\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.131324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.131042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-host\") pod 
\"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.131970 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.131917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ff1633f6-fc91-4bfb-955c-d10341913ddc-konnectivity-ca\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46" Apr 20 10:01:16.132077 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.132008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-tmp\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.132077 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.132026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-etc-tuned\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.133649 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.133615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ff1633f6-fc91-4bfb-955c-d10341913ddc-agent-certs\") pod \"konnectivity-agent-j4g46\" (UID: \"ff1633f6-fc91-4bfb-955c-d10341913ddc\") " pod="kube-system/konnectivity-agent-j4g46" Apr 20 10:01:16.135415 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.135384 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:16.135415 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.135409 2577 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:16.135563 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.135428 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:16.135563 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.135501 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:16.63547661 +0000 UTC m=+3.058537732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:16.137639 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.137620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxq84\" (UniqueName: \"kubernetes.io/projected/ada0f1d6-3214-4751-9778-3af57b7e44c0-kube-api-access-wxq84\") pod \"multus-additional-cni-plugins-4bp75\" (UID: \"ada0f1d6-3214-4751-9778-3af57b7e44c0\") " pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.137873 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.137854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cqx9\" 
(UniqueName: \"kubernetes.io/projected/2eb7affb-8768-41ec-85fb-a62a41bb8709-kube-api-access-7cqx9\") pod \"node-ca-zsznm\" (UID: \"2eb7affb-8768-41ec-85fb-a62a41bb8709\") " pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.138721 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.138701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9kx\" (UniqueName: \"kubernetes.io/projected/18150fd2-df3c-4c5b-9a5b-726839bc0ccc-kube-api-access-dl9kx\") pod \"tuned-q7mlz\" (UID: \"18150fd2-df3c-4c5b-9a5b-726839bc0ccc\") " pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.231671 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-etc-selinux\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a533ff84-2d61-4ccb-9b58-1eea4acb387d-hosts-file\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-kubelet\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-etc-selinux\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx9mm\" (UniqueName: \"kubernetes.io/projected/a533ff84-2d61-4ccb-9b58-1eea4acb387d-kube-api-access-gx9mm\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a533ff84-2d61-4ccb-9b58-1eea4acb387d-hosts-file\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.231811 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-k8s-cni-cncf-io\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-netns\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-config\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-bin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkm6b\" (UniqueName: \"kubernetes.io/projected/0d20a880-ecf7-405a-98f4-141bc115d61b-kube-api-access-rkm6b\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-bin\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-system-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-kubelet\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-conf-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-bin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-bin\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.231795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-k8s-cni-cncf-io\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-etc-kubernetes\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-conf-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-etc-kubernetes\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-systemd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-env-overrides\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f23702f-f4ca-4d67-8cde-3f062233913d-iptables-alerter-script\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-hostroot\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-daemon-config\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-etc-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-device-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a533ff84-2d61-4ccb-9b58-1eea4acb387d-tmp-dir\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-socket-dir-parent\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-netns\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232355 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-kubelet\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-var-lib-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-sys-fs\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-cnibin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232446 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fqw6x\" (UniqueName: \"kubernetes.io/projected/6a07ac99-a265-4370-a43b-b11246f741de-kube-api-access-fqw6x\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-config\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-etc-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-systemd-units\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:01:16.232539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-system-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-systemd-units\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.232631 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-ovn\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-device-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-log-socket\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:01:16.232603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f23702f-f4ca-4d67-8cde-3f062233913d-host-slash\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-cni-binary-copy\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-multus\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-multus-certs\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232743 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-slash\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-script-lib\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-socket-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-os-release\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: 
I0420 10:01:16.232862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a533ff84-2d61-4ccb-9b58-1eea4acb387d-tmp-dir\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-systemd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f23702f-f4ca-4d67-8cde-3f062233913d-host-slash\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-netns\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-ovn-kubernetes\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.233273 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232952 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-sys-fs\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-log-socket\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-cni-dir\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-ovn\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-cnibin\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-var-lib-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-socket-dir-parent\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-socket-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-kubelet\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-var-lib-cni-multus\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-os-release\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.233225 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-hostroot\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.233332 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. No retries permitted until 2026-04-20 10:01:16.733315289 +0000 UTC m=+3.156376409 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.232868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-run-netns\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-node-log\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-registration-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0d20a880-ecf7-405a-98f4-141bc115d61b-host-run-multus-certs\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233427 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-netd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deaf1642-316d-4307-8ade-dc653dd9e116-ovn-node-metrics-cert\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-slash\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwqbz\" (UniqueName: \"kubernetes.io/projected/deaf1642-316d-4307-8ade-dc653dd9e116-kube-api-access-dwqbz\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-multus-daemon-config\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233503 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-registration-dir\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs947\" (UniqueName: \"kubernetes.io/projected/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kube-api-access-xs947\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-node-log\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-host-cni-netd\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d20a880-ecf7-405a-98f4-141bc115d61b-cni-binary-copy\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: 
I0420 10:01:16.233539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deaf1642-316d-4307-8ade-dc653dd9e116-run-openvswitch\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn46t\" (UniqueName: \"kubernetes.io/projected/5f23702f-f4ca-4d67-8cde-3f062233913d-kube-api-access-wn46t\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.233833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-ovnkube-script-lib\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.234730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.234049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5f23702f-f4ca-4d67-8cde-3f062233913d-iptables-alerter-script\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 
10:01:16.235312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.234779 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deaf1642-316d-4307-8ade-dc653dd9e116-env-overrides\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.236620 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.236600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deaf1642-316d-4307-8ade-dc653dd9e116-ovn-node-metrics-cert\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.240532 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.240441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkm6b\" (UniqueName: \"kubernetes.io/projected/0d20a880-ecf7-405a-98f4-141bc115d61b-kube-api-access-rkm6b\") pod \"multus-kwc9j\" (UID: \"0d20a880-ecf7-405a-98f4-141bc115d61b\") " pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.240814 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.240690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx9mm\" (UniqueName: \"kubernetes.io/projected/a533ff84-2d61-4ccb-9b58-1eea4acb387d-kube-api-access-gx9mm\") pod \"node-resolver-fr996\" (UID: \"a533ff84-2d61-4ccb-9b58-1eea4acb387d\") " pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.242494 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.242452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqw6x\" (UniqueName: \"kubernetes.io/projected/6a07ac99-a265-4370-a43b-b11246f741de-kube-api-access-fqw6x\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" 
Apr 20 10:01:16.242646 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.242626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwqbz\" (UniqueName: \"kubernetes.io/projected/deaf1642-316d-4307-8ade-dc653dd9e116-kube-api-access-dwqbz\") pod \"ovnkube-node-bxbxw\" (UID: \"deaf1642-316d-4307-8ade-dc653dd9e116\") " pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.243172 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.243148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn46t\" (UniqueName: \"kubernetes.io/projected/5f23702f-f4ca-4d67-8cde-3f062233913d-kube-api-access-wn46t\") pod \"iptables-alerter-tjxs7\" (UID: \"5f23702f-f4ca-4d67-8cde-3f062233913d\") " pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.243245 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.243175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs947\" (UniqueName: \"kubernetes.io/projected/ab3edac7-ffce-45ee-83d2-17c48dd51ac6-kube-api-access-xs947\") pod \"aws-ebs-csi-driver-node-nztth\" (UID: \"ab3edac7-ffce-45ee-83d2-17c48dd51ac6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.321021 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.320988 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" Apr 20 10:01:16.327772 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.327746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zsznm" Apr 20 10:01:16.336385 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.336368 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bp75" Apr 20 10:01:16.341758 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.341738 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j4g46" Apr 20 10:01:16.347213 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.347194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjxs7" Apr 20 10:01:16.354766 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.354750 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" Apr 20 10:01:16.360237 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.360220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fr996" Apr 20 10:01:16.365764 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.365743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kwc9j" Apr 20 10:01:16.371295 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.371268 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" Apr 20 10:01:16.402508 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.402486 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:16.635932 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.635906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:16.636099 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.636084 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:16.636151 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.636104 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:16.636151 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.636114 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:16.636236 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.636153 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. 
No retries permitted until 2026-04-20 10:01:17.636140329 +0000 UTC m=+4.059201450 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:16.637533 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.637471 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda533ff84_2d61_4ccb_9b58_1eea4acb387d.slice/crio-14326f8dd571a7fcd337ad211622870fdceaf46c810621f66ba071f815680af2 WatchSource:0}: Error finding container 14326f8dd571a7fcd337ad211622870fdceaf46c810621f66ba071f815680af2: Status 404 returned error can't find the container with id 14326f8dd571a7fcd337ad211622870fdceaf46c810621f66ba071f815680af2 Apr 20 10:01:16.638376 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.638357 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeaf1642_316d_4307_8ade_dc653dd9e116.slice/crio-d2226eb8d7be3c31a54a11257bcadd697a80a82609e2ebbcec0c3cfc571bda7c WatchSource:0}: Error finding container d2226eb8d7be3c31a54a11257bcadd697a80a82609e2ebbcec0c3cfc571bda7c: Status 404 returned error can't find the container with id d2226eb8d7be3c31a54a11257bcadd697a80a82609e2ebbcec0c3cfc571bda7c Apr 20 10:01:16.640397 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.640034 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d20a880_ecf7_405a_98f4_141bc115d61b.slice/crio-cefdf67ed09d411f7a20d36e986df90ce05c182bf027bb44f8ae92a4f45c2ee7 WatchSource:0}: Error finding container 
cefdf67ed09d411f7a20d36e986df90ce05c182bf027bb44f8ae92a4f45c2ee7: Status 404 returned error can't find the container with id cefdf67ed09d411f7a20d36e986df90ce05c182bf027bb44f8ae92a4f45c2ee7 Apr 20 10:01:16.642239 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.642216 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18150fd2_df3c_4c5b_9a5b_726839bc0ccc.slice/crio-17ed7a2b7c1ba2c6f3301183a52866886680845be8df033cf8ecfb0d5d6ebb67 WatchSource:0}: Error finding container 17ed7a2b7c1ba2c6f3301183a52866886680845be8df033cf8ecfb0d5d6ebb67: Status 404 returned error can't find the container with id 17ed7a2b7c1ba2c6f3301183a52866886680845be8df033cf8ecfb0d5d6ebb67 Apr 20 10:01:16.642796 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.642753 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb7affb_8768_41ec_85fb_a62a41bb8709.slice/crio-77ca318fe1f1669a855928b43263cb393b25a8b2ba2266560dd9fd52840b2b1e WatchSource:0}: Error finding container 77ca318fe1f1669a855928b43263cb393b25a8b2ba2266560dd9fd52840b2b1e: Status 404 returned error can't find the container with id 77ca318fe1f1669a855928b43263cb393b25a8b2ba2266560dd9fd52840b2b1e Apr 20 10:01:16.647212 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.644806 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f23702f_f4ca_4d67_8cde_3f062233913d.slice/crio-683aa4c2d02fd6ea6039f9cad6106566b948f2827ca3e3700b3d0aaf388184db WatchSource:0}: Error finding container 683aa4c2d02fd6ea6039f9cad6106566b948f2827ca3e3700b3d0aaf388184db: Status 404 returned error can't find the container with id 683aa4c2d02fd6ea6039f9cad6106566b948f2827ca3e3700b3d0aaf388184db Apr 20 10:01:16.647212 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:16.646377 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3edac7_ffce_45ee_83d2_17c48dd51ac6.slice/crio-9e5225509a600bd9845a2a4c44a93dd94fb97381de738ff98415526460393051 WatchSource:0}: Error finding container 9e5225509a600bd9845a2a4c44a93dd94fb97381de738ff98415526460393051: Status 404 returned error can't find the container with id 9e5225509a600bd9845a2a4c44a93dd94fb97381de738ff98415526460393051 Apr 20 10:01:16.736828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:16.736806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:16.736969 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.736945 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:16.737070 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:16.737019 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. No retries permitted until 2026-04-20 10:01:17.736999441 +0000 UTC m=+4.160060574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:17.065445 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.065402 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:15 +0000 UTC" deadline="2027-12-13 01:13:29.615308196 +0000 UTC" Apr 20 10:01:17.065445 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.065442 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14439h12m12.549869242s" Apr 20 10:01:17.089201 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.089142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zsznm" event={"ID":"2eb7affb-8768-41ec-85fb-a62a41bb8709","Type":"ContainerStarted","Data":"77ca318fe1f1669a855928b43263cb393b25a8b2ba2266560dd9fd52840b2b1e"} Apr 20 10:01:17.090843 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.090780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" event={"ID":"18150fd2-df3c-4c5b-9a5b-726839bc0ccc","Type":"ContainerStarted","Data":"17ed7a2b7c1ba2c6f3301183a52866886680845be8df033cf8ecfb0d5d6ebb67"} Apr 20 10:01:17.092815 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.092760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwc9j" event={"ID":"0d20a880-ecf7-405a-98f4-141bc115d61b","Type":"ContainerStarted","Data":"cefdf67ed09d411f7a20d36e986df90ce05c182bf027bb44f8ae92a4f45c2ee7"} Apr 20 10:01:17.099251 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.099190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fr996" 
event={"ID":"a533ff84-2d61-4ccb-9b58-1eea4acb387d","Type":"ContainerStarted","Data":"14326f8dd571a7fcd337ad211622870fdceaf46c810621f66ba071f815680af2"} Apr 20 10:01:17.100448 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.100405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"d2226eb8d7be3c31a54a11257bcadd697a80a82609e2ebbcec0c3cfc571bda7c"} Apr 20 10:01:17.103771 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.103726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j4g46" event={"ID":"ff1633f6-fc91-4bfb-955c-d10341913ddc","Type":"ContainerStarted","Data":"f4c9af81d49568182ee5290100b20cbbf822a19aa9974f69c815eea8303233bc"} Apr 20 10:01:17.111332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.111305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerStarted","Data":"93c88c8050a9c151c031e2809707021a5edc23de304d2cf85d1da0bf7e63cd23"} Apr 20 10:01:17.116215 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.116137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal" event={"ID":"36c01a4c8423a95e1f1d0a7a3f9614c6","Type":"ContainerStarted","Data":"cbb622a26ed58660504bb8b41281a9909454a41ba9279fcd42babfd4d9a277ee"} Apr 20 10:01:17.123788 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.122116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" event={"ID":"ab3edac7-ffce-45ee-83d2-17c48dd51ac6","Type":"ContainerStarted","Data":"9e5225509a600bd9845a2a4c44a93dd94fb97381de738ff98415526460393051"} Apr 20 10:01:17.125344 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.125320 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-operator/iptables-alerter-tjxs7" event={"ID":"5f23702f-f4ca-4d67-8cde-3f062233913d","Type":"ContainerStarted","Data":"683aa4c2d02fd6ea6039f9cad6106566b948f2827ca3e3700b3d0aaf388184db"} Apr 20 10:01:17.134439 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.134395 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-95.ec2.internal" podStartSLOduration=2.134380831 podStartE2EDuration="2.134380831s" podCreationTimestamp="2026-04-20 10:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:17.133925914 +0000 UTC m=+3.556987069" watchObservedRunningTime="2026-04-20 10:01:17.134380831 +0000 UTC m=+3.557441975" Apr 20 10:01:17.645929 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.645853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:17.646096 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.645999 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:17.646096 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.646018 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:17.646096 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.646031 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod 
openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:17.646096 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.646083 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:19.646065554 +0000 UTC m=+6.069126678 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:17.747428 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:17.746867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:17.747428 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.747041 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:17.747428 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:17.747098 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. 
No retries permitted until 2026-04-20 10:01:19.747080009 +0000 UTC m=+6.170141148 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:18.033250 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:18.033168 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:18.079562 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:18.079528 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:18.079987 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:18.079645 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:18.080075 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:18.080041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:18.080160 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:18.080138 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:18.134051 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:18.134019 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5bc7a240581bd3a3cc5a7e6319d273b" containerID="ca906e9dd57d541730dc693e2d673af128d0a1ee981f8ec28a42677d3b8c540f" exitCode=0 Apr 20 10:01:18.134733 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:18.134710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" event={"ID":"e5bc7a240581bd3a3cc5a7e6319d273b","Type":"ContainerDied","Data":"ca906e9dd57d541730dc693e2d673af128d0a1ee981f8ec28a42677d3b8c540f"} Apr 20 10:01:19.145216 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:19.145166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" event={"ID":"e5bc7a240581bd3a3cc5a7e6319d273b","Type":"ContainerStarted","Data":"dc6205e94bddc7506a36b2d6209da66344a6325e18031ac151764078dfaf825c"} Apr 20 10:01:19.662504 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:19.662469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:19.662737 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.662710 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:19.662737 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.662735 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:19.662932 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.662750 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:19.662932 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.662799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:23.662786873 +0000 UTC m=+10.085847994 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:19.763533 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:19.763287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:19.763533 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.763447 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:19.763533 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:19.763494 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. No retries permitted until 2026-04-20 10:01:23.763481041 +0000 UTC m=+10.186542162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:20.078919 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:20.077625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:20.078919 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:20.077788 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:20.078919 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:20.078725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:20.078919 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:20.078831 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:22.078854 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:22.078334 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:22.078854 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:22.078475 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:22.078854 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:22.078754 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:22.079400 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:22.078924 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:23.695738 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:23.695474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:23.696145 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.695647 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:23.696145 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.695820 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:23.696145 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.695835 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:23.696145 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.695898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:31.695879905 +0000 UTC m=+18.118941048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:23.796393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:23.796351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:23.796533 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.796490 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:23.796576 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:23.796550 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. No retries permitted until 2026-04-20 10:01:31.79652952 +0000 UTC m=+18.219590659 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:24.079757 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:24.079183 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:24.079757 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:24.079286 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:24.079757 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:24.079336 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:24.079757 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:24.079402 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:26.077704 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.077649 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:26.078113 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.077682 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:26.078113 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:26.077846 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:26.078113 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:26.077951 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:26.759867 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.759823 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-95.ec2.internal" podStartSLOduration=11.759809042 podStartE2EDuration="11.759809042s" podCreationTimestamp="2026-04-20 10:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:19.163899615 +0000 UTC m=+5.586960762" watchObservedRunningTime="2026-04-20 10:01:26.759809042 +0000 UTC m=+13.182870184" Apr 20 10:01:26.760535 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.760506 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5r7zs"] Apr 20 10:01:26.763332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.763312 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.763464 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:26.763383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90" Apr 20 10:01:26.819059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.819032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.819184 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.819064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-kubelet-config\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.819184 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.819087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-dbus\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920280 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.920247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920427 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.920296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-kubelet-config\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920427 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.920324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-dbus\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920427 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.920388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-kubelet-config\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920427 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:26.920403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9968b560-1fb0-4930-8c96-a8878efe7d90-dbus\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:26.920616 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:26.920425 2577 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:26.920616 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:26.920497 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:27.420476876 +0000 UTC m=+13.843538019 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:27.425692 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:27.425641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:27.426143 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:27.425803 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:27.426143 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:27.425872 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:28.425857781 +0000 UTC m=+14.848918901 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:28.077921 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:28.077886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:28.078082 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:28.078015 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:28.078162 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:28.078089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:28.078213 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:28.078176 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:28.435150 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:28.435056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:28.435533 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:28.435181 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:28.435533 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:28.435240 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:30.435223341 +0000 UTC m=+16.858284462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered Apr 20 10:01:29.077727 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:29.077690 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:29.077899 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:29.077813 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90" Apr 20 10:01:30.077950 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:30.077917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:30.078392 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:30.077957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:30.078392 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:30.078042 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:30.078392 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:30.078146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:30.448855 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:30.448780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:30.449020 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:30.448923 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:30.449020 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:30.448979 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:34.4489647 +0000 UTC m=+20.872025821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:31.077971 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:31.077938 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:31.078387 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.078065 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:31.758819 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:31.758783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:31.758998 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.758968 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:31.758998 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.758991 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:31.759125 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.759004 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xz858 for pod openshift-network-diagnostics/network-check-target-gm5vg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:31.759125 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.759052 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858 podName:ff96330b-c86e-4eab-8d6f-a6db1b630272 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.759038994 +0000 UTC m=+34.182100134 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz858" (UniqueName: "kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858") pod "network-check-target-gm5vg" (UID: "ff96330b-c86e-4eab-8d6f-a6db1b630272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:31.860123 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:31.860086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:31.860274 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.860206 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:31.860274 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:31.860269 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.86024945 +0000 UTC m=+34.283310605 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:32.077590 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:32.077508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:32.077590 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:32.077541 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:32.077906 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:32.077643 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:32.077906 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:32.077748 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:33.077498 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:33.077467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:33.078074 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:33.077575 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:34.080473 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.079688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:34.080473 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:34.079822 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:34.080473 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.080286 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:34.080473 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:34.080366 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:34.169809 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.169560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zsznm" event={"ID":"2eb7affb-8768-41ec-85fb-a62a41bb8709","Type":"ContainerStarted","Data":"a9c6e560af3639446df19decafda7df6872de0616cf847b2928f3a804e83ade4"}
Apr 20 10:01:34.173691 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.173646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" event={"ID":"18150fd2-df3c-4c5b-9a5b-726839bc0ccc","Type":"ContainerStarted","Data":"9f26b6bf48279baeaa8b156fa1c302a4a54c0294157ed44b384dbeb9bd32c8a0"}
Apr 20 10:01:34.192740 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.192620 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zsznm" podStartSLOduration=2.993427176 podStartE2EDuration="20.192608545s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.646108748 +0000 UTC m=+3.069169870" lastFinishedPulling="2026-04-20 10:01:33.845290102 +0000 UTC m=+20.268351239" observedRunningTime="2026-04-20 10:01:34.192460573 +0000 UTC m=+20.615521717" watchObservedRunningTime="2026-04-20 10:01:34.192608545 +0000 UTC m=+20.615669688"
Apr 20 10:01:34.213747 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.213512 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-q7mlz" podStartSLOduration=3.012117972 podStartE2EDuration="20.213494898s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.643917122 +0000 UTC m=+3.066978259" lastFinishedPulling="2026-04-20 10:01:33.845294065 +0000 UTC m=+20.268355185" observedRunningTime="2026-04-20 10:01:34.212908276 +0000 UTC m=+20.635969419" watchObservedRunningTime="2026-04-20 10:01:34.213494898 +0000 UTC m=+20.636556045"
Apr 20 10:01:34.481324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:34.481295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:34.481431 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:34.481401 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:34.481484 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:34.481444 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:42.481432 +0000 UTC m=+28.904493126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:35.077954 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.077782 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:35.078079 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:35.078040 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:35.179359 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.179300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" event={"ID":"ab3edac7-ffce-45ee-83d2-17c48dd51ac6","Type":"ContainerStarted","Data":"8d91ff60f743e39f0bf861ada2dd87559c730f9f1833f8424f7fc6fe84a80ab2"}
Apr 20 10:01:35.180509 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.180486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tjxs7" event={"ID":"5f23702f-f4ca-4d67-8cde-3f062233913d","Type":"ContainerStarted","Data":"986c0eed82c007a02d3cf752466ca684f976712df459ef86c446f002765f2685"}
Apr 20 10:01:35.181676 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.181631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kwc9j" event={"ID":"0d20a880-ecf7-405a-98f4-141bc115d61b","Type":"ContainerStarted","Data":"3d9d6acd400b6041a1d377dd73ff6a31e30a1b987074b0d0a7aace59044e6b73"}
Apr 20 10:01:35.182788 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.182769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fr996" event={"ID":"a533ff84-2d61-4ccb-9b58-1eea4acb387d","Type":"ContainerStarted","Data":"f2dc59584a58f4ffe95eba58a94ffa7d34679b7e5cc246aaf90e946940f59761"}
Apr 20 10:01:35.185103 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:01:35.185389 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185372 2577 generic.go:358] "Generic (PLEG): container finished" podID="deaf1642-316d-4307-8ade-dc653dd9e116" containerID="d3c68cdd6d633927e8bc2469cd953e5e7107d7e2152120116cc816124abd594e" exitCode=1
Apr 20 10:01:35.185436 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"04a916ac75cb3e2fc5431e98b7e95b8cbbf6a2498c71ed1cf0556aaf2ec7a3b2"}
Apr 20 10:01:35.185472 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"0b8d7554acf5c446f27acd8b3aedf2b75e56ab90f0e33b698cf9d14ede7f7e74"}
Apr 20 10:01:35.185472 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"f15e04595110f8c709cec485900a0993655f7b5045159d873891e898122eade1"}
Apr 20 10:01:35.185472 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"755700d6fd30257d8a0947613057812858fd9cdf8bbaff11a3223ca78eec4965"}
Apr 20 10:01:35.185573 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerDied","Data":"d3c68cdd6d633927e8bc2469cd953e5e7107d7e2152120116cc816124abd594e"}
Apr 20 10:01:35.185573 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.185489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"1ec7217da797309b1ec61e357224fb8a6dbb347a21cf2de4f20b183e73bbfb39"}
Apr 20 10:01:35.186548 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.186528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j4g46" event={"ID":"ff1633f6-fc91-4bfb-955c-d10341913ddc","Type":"ContainerStarted","Data":"7447990dcfab0fbdd0a325258f4d83ccbaf6ff1fd342d3b4d14256bb9a8cc146"}
Apr 20 10:01:35.187779 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.187760 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="3804101feb3e6b61f1f9ac1d4cccb135b72bd4b59ccbeae71e4c7bfaf808d64c" exitCode=0
Apr 20 10:01:35.187856 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.187839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"3804101feb3e6b61f1f9ac1d4cccb135b72bd4b59ccbeae71e4c7bfaf808d64c"}
Apr 20 10:01:35.223382 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.223345 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tjxs7" podStartSLOduration=3.9802464 podStartE2EDuration="21.223334462s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.646435033 +0000 UTC m=+3.069496154" lastFinishedPulling="2026-04-20 10:01:33.889523094 +0000 UTC m=+20.312584216" observedRunningTime="2026-04-20 10:01:35.199501289 +0000 UTC m=+21.622562432" watchObservedRunningTime="2026-04-20 10:01:35.223334462 +0000 UTC m=+21.646395605"
Apr 20 10:01:35.240241 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.240207 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j4g46" podStartSLOduration=4.032511156 podStartE2EDuration="21.240198109s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.63762378 +0000 UTC m=+3.060684913" lastFinishedPulling="2026-04-20 10:01:33.845310745 +0000 UTC m=+20.268371866" observedRunningTime="2026-04-20 10:01:35.223450191 +0000 UTC m=+21.646511322" watchObservedRunningTime="2026-04-20 10:01:35.240198109 +0000 UTC m=+21.663259256"
Apr 20 10:01:35.240527 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.240504 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fr996" podStartSLOduration=3.9882077110000003 podStartE2EDuration="21.240499799s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.639291578 +0000 UTC m=+3.062352699" lastFinishedPulling="2026-04-20 10:01:33.89158365 +0000 UTC m=+20.314644787" observedRunningTime="2026-04-20 10:01:35.240166821 +0000 UTC m=+21.663227964" watchObservedRunningTime="2026-04-20 10:01:35.240499799 +0000 UTC m=+21.663560942"
Apr 20 10:01:35.286142 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.286098 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kwc9j" podStartSLOduration=3.807856557 podStartE2EDuration="21.286085869s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.642419411 +0000 UTC m=+3.065480535" lastFinishedPulling="2026-04-20 10:01:34.120648713 +0000 UTC m=+20.543709847" observedRunningTime="2026-04-20 10:01:35.260458714 +0000 UTC m=+21.683519866" watchObservedRunningTime="2026-04-20 10:01:35.286085869 +0000 UTC m=+21.709147013"
Apr 20 10:01:35.494565 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.494511 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 10:01:35.752294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:35.752212 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:36.078441 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.078370 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:36.078704 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:36.078491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:36.078825 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.078369 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:36.078950 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:36.078917 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:36.084363 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.084271 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T10:01:35.494528347Z","UUID":"d663ea7a-cf11-40d8-81b3-e71bc5657b62","Handler":null,"Name":"","Endpoint":""}
Apr 20 10:01:36.087642 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.087620 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 10:01:36.087762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.087652 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 10:01:36.192204 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:36.192168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" event={"ID":"ab3edac7-ffce-45ee-83d2-17c48dd51ac6","Type":"ContainerStarted","Data":"68fcdaaf48b7080e956eb5790344eb1a76b6ddf25e124f59af479118c2fa7c5c"}
Apr 20 10:01:37.050082 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.050042 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:37.050963 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.050921 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:37.077947 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.077921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:37.078070 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:37.078024 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:37.197024 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.196954 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:01:37.197589 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.197358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"9a00b66570b17e60b813dd5d52d2685ede54e72d4d415c6fb4d27d659b8a01c7"}
Apr 20 10:01:37.199402 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.199376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" event={"ID":"ab3edac7-ffce-45ee-83d2-17c48dd51ac6","Type":"ContainerStarted","Data":"4ed341cf23b8792004b72ffc29620338ce4b9e5c9210653347c2c994758aee62"}
Apr 20 10:01:37.200115 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.200092 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j4g46"
Apr 20 10:01:37.218937 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:37.218889 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nztth" podStartSLOduration=3.266455838 podStartE2EDuration="23.218877907s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.648180844 +0000 UTC m=+3.071241968" lastFinishedPulling="2026-04-20 10:01:36.600602914 +0000 UTC m=+23.023664037" observedRunningTime="2026-04-20 10:01:37.218696865 +0000 UTC m=+23.641758009" watchObservedRunningTime="2026-04-20 10:01:37.218877907 +0000 UTC m=+23.641939050"
Apr 20 10:01:38.077540 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:38.077501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:38.078058 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:38.078021 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:38.078164 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:38.078088 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:38.078264 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:38.078244 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:39.077767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:39.077733 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:39.078264 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:39.077848 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:40.077762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.077558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:40.077927 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.077620 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:40.077927 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:40.077854 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:40.077927 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:40.077882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:40.207381 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.207357 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:01:40.207700 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.207679 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"8e1fda0de11fd739975babcd6d79e722ec685fbfa2e29285af0cb78a2407f89d"}
Apr 20 10:01:40.208037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.208020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:40.208037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.208047 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:40.208159 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.208141 2577 scope.go:117] "RemoveContainer" containerID="d3c68cdd6d633927e8bc2469cd953e5e7107d7e2152120116cc816124abd594e"
Apr 20 10:01:40.209431 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.209405 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="9f5ec4ff1d862f2536470510e42c38dc67b2dee89427ab6665cdadd7d9d55a1c" exitCode=0
Apr 20 10:01:40.209507 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.209441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"9f5ec4ff1d862f2536470510e42c38dc67b2dee89427ab6665cdadd7d9d55a1c"}
Apr 20 10:01:40.225550 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:40.225533 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:41.077653 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.077504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:41.077763 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:41.077741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:41.117795 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.117730 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5r7zs"]
Apr 20 10:01:41.122268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.122239 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vs775"]
Apr 20 10:01:41.122359 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.122349 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:41.122467 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:41.122440 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de"
Apr 20 10:01:41.122853 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.122833 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gm5vg"]
Apr 20 10:01:41.122937 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.122915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:41.123000 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:41.122984 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272"
Apr 20 10:01:41.214624 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.214604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:01:41.215005 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.214980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" event={"ID":"deaf1642-316d-4307-8ade-dc653dd9e116","Type":"ContainerStarted","Data":"f45ed85423f85dca64c8205ea580439e1ed34e098b7c7387c454b885159e544e"}
Apr 20 10:01:41.215285 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.215267 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:41.216863 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.216838 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="1c0ccca67a74402ad476d7c4a9ae4af88e8214632c7f955bd2b74b9628c88edc" exitCode=0
Apr 20 10:01:41.216951 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.216921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:41.216951 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.216932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"1c0ccca67a74402ad476d7c4a9ae4af88e8214632c7f955bd2b74b9628c88edc"}
Apr 20 10:01:41.217117 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:41.217098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:41.230603 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.230586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:01:41.247397 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:41.247362 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw" podStartSLOduration=9.950308451 podStartE2EDuration="27.247351235s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.640632352 +0000 UTC m=+3.063693476" lastFinishedPulling="2026-04-20 10:01:33.937675133 +0000 UTC m=+20.360736260" observedRunningTime="2026-04-20 10:01:41.245898616 +0000 UTC m=+27.668959759" watchObservedRunningTime="2026-04-20 10:01:41.247351235 +0000 UTC m=+27.670412375"
Apr 20 10:01:42.220453 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:42.220387 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="3e93413bd0cd8df777a977ac22995c5274b8851f4347f1a506ffc1a2618111ca" exitCode=0
Apr 20 10:01:42.220453 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:42.220441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"3e93413bd0cd8df777a977ac22995c5274b8851f4347f1a506ffc1a2618111ca"}
Apr 20 10:01:42.540495 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:42.540431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:42.540602 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:42.540525 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:42.540602 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:42.540569 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret podName:9968b560-1fb0-4930-8c96-a8878efe7d90 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:58.540556142 +0000 UTC m=+44.963617263 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret") pod "global-pull-secret-syncer-5r7zs" (UID: "9968b560-1fb0-4930-8c96-a8878efe7d90") : object "kube-system"/"original-pull-secret" not registered
Apr 20 10:01:43.078291 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:43.078260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs"
Apr 20 10:01:43.078291 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:43.078291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:01:43.078449 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:43.078362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90"
Apr 20 10:01:43.078449 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:43.078395 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:43.078635 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:43.078600 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:43.078779 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:43.078707 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:45.078356 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:45.078314 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:45.078994 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:45.078435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gm5vg" podUID="ff96330b-c86e-4eab-8d6f-a6db1b630272" Apr 20 10:01:45.078994 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:45.078445 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:45.078994 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:45.078475 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:45.078994 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:45.078538 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5r7zs" podUID="9968b560-1fb0-4930-8c96-a8878efe7d90" Apr 20 10:01:45.078994 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:45.078639 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs775" podUID="6a07ac99-a265-4370-a43b-b11246f741de" Apr 20 10:01:46.916143 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.915952 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-95.ec2.internal" event="NodeReady" Apr 20 10:01:46.916634 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.916256 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 10:01:46.972132 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.972104 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cqrj9"] Apr 20 10:01:46.977366 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.977343 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5fzkf"] Apr 20 10:01:46.977516 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.977497 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:46.980051 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.980030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qvhbt\"" Apr 20 10:01:46.980158 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.980103 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 10:01:46.980212 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.980030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 10:01:46.981795 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.981776 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:46.984214 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.984195 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 10:01:46.984564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.984529 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 10:01:46.984716 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.984630 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 10:01:46.984813 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.984786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4xmrs\"" Apr 20 10:01:46.986559 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.986540 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqrj9"] Apr 20 10:01:46.989554 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:46.989535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5fzkf"] Apr 20 10:01:47.074606 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm967\" (UniqueName: \"kubernetes.io/projected/84d8b916-498e-4189-840c-c6931e4b0d70-kube-api-access-lm967\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.074780 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/2dce1807-1577-4d4f-8a49-740ba99a59ca-tmp-dir\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.074780 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.074780 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dce1807-1577-4d4f-8a49-740ba99a59ca-config-volume\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.074780 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.074948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.074781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vd26\" (UniqueName: \"kubernetes.io/projected/2dce1807-1577-4d4f-8a49-740ba99a59ca-kube-api-access-2vd26\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.077676 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.077640 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:47.077798 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.077780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:47.077880 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.077867 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:47.081616 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.081595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 10:01:47.081769 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.081749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 10:01:47.081875 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.081857 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\"" Apr 20 10:01:47.082153 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.082129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 10:01:47.082699 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.082676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-plq86\"" Apr 20 10:01:47.082796 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.082729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 10:01:47.175848 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2dce1807-1577-4d4f-8a49-740ba99a59ca-config-volume\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.175848 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.175848 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vd26\" (UniqueName: \"kubernetes.io/projected/2dce1807-1577-4d4f-8a49-740ba99a59ca-kube-api-access-2vd26\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.175848 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm967\" (UniqueName: \"kubernetes.io/projected/84d8b916-498e-4189-840c-c6931e4b0d70-kube-api-access-lm967\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2dce1807-1577-4d4f-8a49-740ba99a59ca-tmp-dir\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.175907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.175950 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.176002 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.176021 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.676006144 +0000 UTC m=+34.099067265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found Apr 20 10:01:47.176079 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.176036 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:47.676030186 +0000 UTC m=+34.099091307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found Apr 20 10:01:47.176328 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.176313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2dce1807-1577-4d4f-8a49-740ba99a59ca-tmp-dir\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.176438 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.176419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dce1807-1577-4d4f-8a49-740ba99a59ca-config-volume\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.187998 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.187980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm967\" (UniqueName: \"kubernetes.io/projected/84d8b916-498e-4189-840c-c6931e4b0d70-kube-api-access-lm967\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.188123 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.187999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vd26\" (UniqueName: \"kubernetes.io/projected/2dce1807-1577-4d4f-8a49-740ba99a59ca-kube-api-access-2vd26\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.680073 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.680038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:47.680238 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.680131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:47.680238 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.680213 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:47.680238 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.680225 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:47.680357 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.680283 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:48.680269241 +0000 UTC m=+35.103330361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found Apr 20 10:01:47.680357 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.680297 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:01:48.680289511 +0000 UTC m=+35.103350632 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found Apr 20 10:01:47.781098 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.781060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:47.783673 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.783642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz858\" (UniqueName: \"kubernetes.io/projected/ff96330b-c86e-4eab-8d6f-a6db1b630272-kube-api-access-xz858\") pod \"network-check-target-gm5vg\" (UID: \"ff96330b-c86e-4eab-8d6f-a6db1b630272\") " pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:47.882223 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.882196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775" Apr 20 10:01:47.882313 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.882280 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 10:01:47.882361 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:47.882325 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs podName:6a07ac99-a265-4370-a43b-b11246f741de 
nodeName:}" failed. No retries permitted until 2026-04-20 10:02:19.882312229 +0000 UTC m=+66.305373350 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs") pod "network-metrics-daemon-vs775" (UID: "6a07ac99-a265-4370-a43b-b11246f741de") : secret "metrics-daemon-secret" not found Apr 20 10:01:47.996473 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:47.996423 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:01:48.349241 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:48.349219 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gm5vg"] Apr 20 10:01:48.352435 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:48.352400 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff96330b_c86e_4eab_8d6f_a6db1b630272.slice/crio-dc67f38b1e2aaae043d3a8a94068d57843d92877c32d3783fcc59c3713fdfe0b WatchSource:0}: Error finding container dc67f38b1e2aaae043d3a8a94068d57843d92877c32d3783fcc59c3713fdfe0b: Status 404 returned error can't find the container with id dc67f38b1e2aaae043d3a8a94068d57843d92877c32d3783fcc59c3713fdfe0b Apr 20 10:01:48.688282 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:48.688109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9" Apr 20 10:01:48.688379 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:48.688318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf" Apr 20 10:01:48.688379 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:48.688254 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 10:01:48.688445 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:48.688407 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:01:50.688391846 +0000 UTC m=+37.111452998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found Apr 20 10:01:48.688445 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:48.688421 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 10:01:48.688519 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:48.688467 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:50.688453636 +0000 UTC m=+37.111514756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found Apr 20 10:01:49.237165 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:49.237112 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="71c5b03bcf34a0e1977f9fa8212c3a678515fdb3c0728fbb378ea82b76caaeba" exitCode=0 Apr 20 10:01:49.237605 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:49.237194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"71c5b03bcf34a0e1977f9fa8212c3a678515fdb3c0728fbb378ea82b76caaeba"} Apr 20 10:01:49.238494 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:49.238475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gm5vg" event={"ID":"ff96330b-c86e-4eab-8d6f-a6db1b630272","Type":"ContainerStarted","Data":"dc67f38b1e2aaae043d3a8a94068d57843d92877c32d3783fcc59c3713fdfe0b"} Apr 20 10:01:50.242411 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:50.242383 2577 generic.go:358] "Generic (PLEG): container finished" podID="ada0f1d6-3214-4751-9778-3af57b7e44c0" containerID="700bbee31d38e0d182dc4afd282c0330807a1489345d5dd7a4d51042cd239dff" exitCode=0 Apr 20 10:01:50.242830 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:50.242422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerDied","Data":"700bbee31d38e0d182dc4afd282c0330807a1489345d5dd7a4d51042cd239dff"} Apr 20 10:01:50.705854 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:50.705812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:01:50.706012 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:50.705906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:01:50.706012 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:50.705988 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:01:50.706012 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:50.706003 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:01:50.706163 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:50.706072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:01:54.706049818 +0000 UTC m=+41.129110961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found
Apr 20 10:01:50.706163 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:50.706093 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:54.706084023 +0000 UTC m=+41.129145147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found
Apr 20 10:01:51.247974 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.247934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bp75" event={"ID":"ada0f1d6-3214-4751-9778-3af57b7e44c0","Type":"ContainerStarted","Data":"044683337a777d2b7982d6ffc6ac01713a28bebae8ac9ac4c381d39f34437380"}
Apr 20 10:01:51.276451 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.276351 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4bp75" podStartSLOduration=5.727990588 podStartE2EDuration="37.276332487s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:16.637269072 +0000 UTC m=+3.060330193" lastFinishedPulling="2026-04-20 10:01:48.185610967 +0000 UTC m=+34.608672092" observedRunningTime="2026-04-20 10:01:51.272395706 +0000 UTC m=+37.695456851" watchObservedRunningTime="2026-04-20 10:01:51.276332487 +0000 UTC m=+37.699393633"
Apr 20 10:01:51.740791 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.740763 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2zgz2"]
Apr 20 10:01:51.753163 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.753146 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:51.754899 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.754875 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2zgz2"]
Apr 20 10:01:51.755972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.755945 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.756117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.756096 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pkhsg\""
Apr 20 10:01:51.756161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.756106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 10:01:51.756161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.756155 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.756346 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.756328 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 10:01:51.770824 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.770807 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 10:01:51.844397 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.844345 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k"]
Apr 20 10:01:51.849251 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.849234 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"]
Apr 20 10:01:51.849419 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.849401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k"
Apr 20 10:01:51.852239 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.852220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.855221 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.852600 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-dkvck\""
Apr 20 10:01:51.855708 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.855691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.861482 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.859798 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"]
Apr 20 10:01:51.868222 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.868172 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg"]
Apr 20 10:01:51.868333 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.868244 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:51.868333 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.868297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:51.871335 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.871313 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 10:01:51.871423 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.871337 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 10:01:51.871423 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.871370 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.871423 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.871410 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.872701 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.872681 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.872803 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.872710 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4w8k2\""
Apr 20 10:01:51.872803 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.872763 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 10:01:51.872922 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.872687 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.872922 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.872822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mn8pk\""
Apr 20 10:01:51.881047 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.880929 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58f946d5-ggd48"]
Apr 20 10:01:51.881119 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.881056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg"
Apr 20 10:01:51.883805 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.883787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tm9vs\""
Apr 20 10:01:51.893457 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893440 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k"]
Apr 20 10:01:51.893529 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"]
Apr 20 10:01:51.893529 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg"]
Apr 20 10:01:51.893529 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893480 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"]
Apr 20 10:01:51.893529 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893488 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58f946d5-ggd48"]
Apr 20 10:01:51.893647 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.893557 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:51.896367 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.896353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 10:01:51.896695 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.896673 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 10:01:51.897006 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.896991 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6bmx\""
Apr 20 10:01:51.897256 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.897242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 10:01:51.901909 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.901888 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 10:01:51.914240 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.914225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-config\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:51.914347 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.914306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjgk\" (UniqueName: \"kubernetes.io/projected/db045f44-d582-4037-82eb-d656372b093e-kube-api-access-grjgk\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:51.914347 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.914350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db045f44-d582-4037-82eb-d656372b093e-serving-cert\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:51.914711 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.914377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-trusted-ca\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:51.937592 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.937572 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"]
Apr 20 10:01:51.953748 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.953723 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"]
Apr 20 10:01:51.953864 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.953848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"
Apr 20 10:01:51.956303 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.956284 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 10:01:51.956399 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.956289 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9pq6q\""
Apr 20 10:01:51.956399 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.956361 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.956502 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.956418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 10:01:51.956557 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.956508 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.965762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.965745 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5fn5"]
Apr 20 10:01:51.965895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.965872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:51.969342 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.969325 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-pfjwg\""
Apr 20 10:01:51.970015 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.969393 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.970110 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.969470 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 10:01:51.970171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.969759 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.970234 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.969998 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 10:01:51.986513 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.986492 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"]
Apr 20 10:01:51.986635 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.986622 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:51.989096 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.989076 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 20 10:01:51.989207 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.989192 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 10:01:51.989379 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.989344 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 10:01:51.989482 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.989412 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 20 10:01:51.989561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.989506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-txgvw\""
Apr 20 10:01:51.994095 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:51.994075 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 10:01:52.010781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.010764 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dbc9c6698-7l896"]
Apr 20 10:01:52.010963 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.010902 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"
Apr 20 10:01:52.013628 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.013602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 10:01:52.013628 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.013624 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 10:01:52.013801 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.013688 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hf5zq\""
Apr 20 10:01:52.015260 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015240 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52br\" (UniqueName: \"kubernetes.io/projected/e27ef370-a030-44aa-a961-156382685e11-kube-api-access-w52br\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:52.015372 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grjgk\" (UniqueName: \"kubernetes.io/projected/db045f44-d582-4037-82eb-d656372b093e-kube-api-access-grjgk\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.015372 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015372 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015524 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjdz7\" (UniqueName: \"kubernetes.io/projected/5cd2cf42-4b4b-4260-963f-fd7f94555d35-kube-api-access-sjdz7\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:52.015524 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015484 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcn9\" (UniqueName: \"kubernetes.io/projected/ae249ad7-50d4-4db6-be40-535b35542e1c-kube-api-access-sjcn9\") pod \"volume-data-source-validator-7c6cbb6c87-djf8k\" (UID: \"ae249ad7-50d4-4db6-be40-535b35542e1c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k"
Apr 20 10:01:52.015627 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxj4\" (UniqueName: \"kubernetes.io/projected/73ef9bd2-5eeb-4e74-9dfe-17214e80e475-kube-api-access-ktxj4\") pod \"network-check-source-8894fc9bd-4dcjg\" (UID: \"73ef9bd2-5eeb-4e74-9dfe-17214e80e475\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg"
Apr 20 10:01:52.015627 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015627 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbzd\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015797 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db045f44-d582-4037-82eb-d656372b093e-serving-cert\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.015797 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-trusted-ca\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.015797 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-config\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015927 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.015948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5cd2cf42-4b4b-4260-963f-fd7f94555d35-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:52.016255 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.015970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:52.016534 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.016511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-config\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.016735 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.016579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db045f44-d582-4037-82eb-d656372b093e-trusted-ca\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.019405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.019385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db045f44-d582-4037-82eb-d656372b093e-serving-cert\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.024378 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.024360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjgk\" (UniqueName: \"kubernetes.io/projected/db045f44-d582-4037-82eb-d656372b093e-kube-api-access-grjgk\") pod \"console-operator-9d4b6777b-2zgz2\" (UID: \"db045f44-d582-4037-82eb-d656372b093e\") " pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.035618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"]
Apr 20 10:01:52.035726 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035621 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"]
Apr 20 10:01:52.035726 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035635 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5fn5"]
Apr 20 10:01:52.035726 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035647 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dbc9c6698-7l896"]
Apr 20 10:01:52.035726 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035682 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"]
Apr 20 10:01:52.035894 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.035730 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.038617 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.038593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 10:01:52.038740 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.038641 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 10:01:52.038740 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.038641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 10:01:52.038740 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.038731 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rzqv4\""
Apr 20 10:01:52.038901 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.038748 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 10:01:52.039059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.039043 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 10:01:52.039230 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.039206 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 20 10:01:52.062410 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.062388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:01:52.122709 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:52.122845 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w52br\" (UniqueName: \"kubernetes.io/projected/e27ef370-a030-44aa-a961-156382685e11-kube-api-access-w52br\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:52.122845 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc45p\" (UniqueName: \"kubernetes.io/projected/9cedfb03-6b25-46db-a934-2933c2d42473-kube-api-access-nc45p\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.122845 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.122777 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 10:01:52.122845 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.122845 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5jst\" (UniqueName: \"kubernetes.io/projected/289379f5-7b90-499a-a7cd-14690b1bb4b1-kube-api-access-j5jst\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.123066 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.122851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289379f5-7b90-499a-a7cd-14690b1bb4b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.123066 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.122888 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls podName:e27ef370-a030-44aa-a961-156382685e11 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.622868164 +0000 UTC m=+39.045929287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k822b" (UID: "e27ef370-a030-44aa-a961-156382685e11") : secret "samples-operator-tls" not found
Apr 20 10:01:52.123161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.123161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv4f\" (UniqueName: \"kubernetes.io/projected/81c6d571-c228-4b53-8d5e-c96359b3d8f6-kube-api-access-4cv4f\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"
Apr 20 10:01:52.123161 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.123294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\"
(UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.123294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcn9\" (UniqueName: \"kubernetes.io/projected/ae249ad7-50d4-4db6-be40-535b35542e1c-kube-api-access-sjcn9\") pod \"volume-data-source-validator-7c6cbb6c87-djf8k\" (UID: \"ae249ad7-50d4-4db6-be40-535b35542e1c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" Apr 20 10:01:52.123294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.123294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.123294 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289379f5-7b90-499a-a7cd-14690b1bb4b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" Apr 20 10:01:52.123518 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a97f42a-851b-4803-9be6-3ad666e6f307-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbzd\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-snapshots\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123403 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-tmp\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cedfb03-6b25-46db-a934-2933c2d42473-serving-cert\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.123518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.123498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxj4\" (UniqueName: \"kubernetes.io/projected/73ef9bd2-5eeb-4e74-9dfe-17214e80e475-kube-api-access-ktxj4\") pod \"network-check-source-8894fc9bd-4dcjg\" (UID: \"73ef9bd2-5eeb-4e74-9dfe-17214e80e475\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" Apr 20 10:01:52.124135 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.124046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.124135 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.124072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 
10:01:52.124135 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.124095 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:52.124135 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.124114 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found Apr 20 10:01:52.124357 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.124208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5cd2cf42-4b4b-4260-963f-fd7f94555d35-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:52.124357 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.124239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c6d571-c228-4b53-8d5e-c96359b3d8f6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.124357 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.124264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.125078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.125213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6d571-c228-4b53-8d5e-c96359b3d8f6-config\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.125241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjdz7\" (UniqueName: \"kubernetes.io/projected/5cd2cf42-4b4b-4260-963f-fd7f94555d35-kube-api-access-sjdz7\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.125483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.125556 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.125605 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.62559171 +0000 UTC m=+39.048652845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.125637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:52.126088 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.126000 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.625986366 +0000 UTC m=+39.049047508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found Apr 20 10:01:52.129242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.129195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5cd2cf42-4b4b-4260-963f-fd7f94555d35-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:52.135773 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.135731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcn9\" (UniqueName: \"kubernetes.io/projected/ae249ad7-50d4-4db6-be40-535b35542e1c-kube-api-access-sjcn9\") pod \"volume-data-source-validator-7c6cbb6c87-djf8k\" (UID: \"ae249ad7-50d4-4db6-be40-535b35542e1c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" Apr 20 10:01:52.139117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.138834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52br\" (UniqueName: \"kubernetes.io/projected/e27ef370-a030-44aa-a961-156382685e11-kube-api-access-w52br\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" Apr 20 10:01:52.139117 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.139077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxj4\" (UniqueName: \"kubernetes.io/projected/73ef9bd2-5eeb-4e74-9dfe-17214e80e475-kube-api-access-ktxj4\") pod \"network-check-source-8894fc9bd-4dcjg\" (UID: 
\"73ef9bd2-5eeb-4e74-9dfe-17214e80e475\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" Apr 20 10:01:52.141026 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.140965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.141108 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.141087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjdz7\" (UniqueName: \"kubernetes.io/projected/5cd2cf42-4b4b-4260-963f-fd7f94555d35-kube-api-access-sjdz7\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:52.141202 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.141178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbzd\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.141927 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.141888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.145264 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.145222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.147332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.147305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:52.163092 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.162791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" Apr 20 10:01:52.189506 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.189465 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" Apr 20 10:01:52.212532 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.212501 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2zgz2"] Apr 20 10:01:52.217918 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:52.217880 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb045f44_d582_4037_82eb_d656372b093e.slice/crio-ceadad452a5078d7eab7b4e3291e87272dd97a70840f1f3973590f5da667e294 WatchSource:0}: Error finding container ceadad452a5078d7eab7b4e3291e87272dd97a70840f1f3973590f5da667e294: Status 404 returned error can't find the container with id ceadad452a5078d7eab7b4e3291e87272dd97a70840f1f3973590f5da667e294 Apr 20 10:01:52.226234 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6d571-c228-4b53-8d5e-c96359b3d8f6-config\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.226345 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:52.226345 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:52.226444 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-stats-auth\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:52.226444 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc45p\" (UniqueName: \"kubernetes.io/projected/9cedfb03-6b25-46db-a934-2933c2d42473-kube-api-access-nc45p\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.226444 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5jst\" (UniqueName: \"kubernetes.io/projected/289379f5-7b90-499a-a7cd-14690b1bb4b1-kube-api-access-j5jst\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289379f5-7b90-499a-a7cd-14690b1bb4b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv4f\" (UniqueName: \"kubernetes.io/projected/81c6d571-c228-4b53-8d5e-c96359b3d8f6-kube-api-access-4cv4f\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289379f5-7b90-499a-a7cd-14690b1bb4b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" Apr 20 10:01:52.226840 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a97f42a-851b-4803-9be6-3ad666e6f307-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-default-certificate\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:52.226840 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-snapshots\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.227242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:52.227242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-tmp\") pod 
\"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.227242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cedfb03-6b25-46db-a934-2933c2d42473-serving-cert\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5" Apr 20 10:01:52.227242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c6d571-c228-4b53-8d5e-c96359b3d8f6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.227242 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.226994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95km\" (UniqueName: \"kubernetes.io/projected/adc46263-5b99-4162-8415-8a084543bdad-kube-api-access-n95km\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.227550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c6d571-c228-4b53-8d5e-c96359b3d8f6-config\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.227652 2577 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.227751 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.727732407 +0000 UTC m=+39.150793530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.228841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.229202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289379f5-7b90-499a-a7cd-14690b1bb4b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.229365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-snapshots\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.229631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cedfb03-6b25-46db-a934-2933c2d42473-tmp\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.230187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cedfb03-6b25-46db-a934-2933c2d42473-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.230621 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.230636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a97f42a-851b-4803-9be6-3ad666e6f307-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"
Apr 20 10:01:52.232779 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.232739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289379f5-7b90-499a-a7cd-14690b1bb4b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.233055 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.233012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cedfb03-6b25-46db-a934-2933c2d42473-serving-cert\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.233146 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.233126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c6d571-c228-4b53-8d5e-c96359b3d8f6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"
Apr 20 10:01:52.236498 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.236473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc45p\" (UniqueName: \"kubernetes.io/projected/9cedfb03-6b25-46db-a934-2933c2d42473-kube-api-access-nc45p\") pod \"insights-operator-585dfdc468-s5fn5\" (UID: \"9cedfb03-6b25-46db-a934-2933c2d42473\") " pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.236738 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.236716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jst\" (UniqueName: \"kubernetes.io/projected/289379f5-7b90-499a-a7cd-14690b1bb4b1-kube-api-access-j5jst\") pod \"kube-storage-version-migrator-operator-6769c5d45-dk6lf\" (UID: \"289379f5-7b90-499a-a7cd-14690b1bb4b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.237673 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.237636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv4f\" (UniqueName: \"kubernetes.io/projected/81c6d571-c228-4b53-8d5e-c96359b3d8f6-kube-api-access-4cv4f\") pod \"service-ca-operator-d6fc45fc5-79qp4\" (UID: \"81c6d571-c228-4b53-8d5e-c96359b3d8f6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"
Apr 20 10:01:52.254553 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.253789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gm5vg" event={"ID":"ff96330b-c86e-4eab-8d6f-a6db1b630272","Type":"ContainerStarted","Data":"67d6a7bedc2871276340d5d1282673ac29baea80c9ab7a0b7e36092ea3b6114d"}
Apr 20 10:01:52.254553 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.254057 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gm5vg"
Apr 20 10:01:52.256324 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.256239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" event={"ID":"db045f44-d582-4037-82eb-d656372b093e","Type":"ContainerStarted","Data":"ceadad452a5078d7eab7b4e3291e87272dd97a70840f1f3973590f5da667e294"}
Apr 20 10:01:52.262935 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.262910 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"
Apr 20 10:01:52.275684 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.275422 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gm5vg" podStartSLOduration=35.034990044 podStartE2EDuration="38.275402986s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:01:48.354751556 +0000 UTC m=+34.777812681" lastFinishedPulling="2026-04-20 10:01:51.595164502 +0000 UTC m=+38.018225623" observedRunningTime="2026-04-20 10:01:52.274378715 +0000 UTC m=+38.697439856" watchObservedRunningTime="2026-04-20 10:01:52.275402986 +0000 UTC m=+38.698464132"
Apr 20 10:01:52.276126 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.275858 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"
Apr 20 10:01:52.296349 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.295183 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5fn5"
Apr 20 10:01:52.299758 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.297307 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k"]
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-stats-auth\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328200 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg"]
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.328354 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.828331709 +0000 UTC m=+39.251392833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-default-certificate\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.328557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n95km\" (UniqueName: \"kubernetes.io/projected/adc46263-5b99-4162-8415-8a084543bdad-kube-api-access-n95km\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.329425 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 10:01:52.329767 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.329475 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:52.829458819 +0000 UTC m=+39.252519942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found
Apr 20 10:01:52.333555 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.332827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-default-certificate\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.333926 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.333882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-stats-auth\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.340997 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.340939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95km\" (UniqueName: \"kubernetes.io/projected/adc46263-5b99-4162-8415-8a084543bdad-kube-api-access-n95km\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.414408 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.414296 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4"]
Apr 20 10:01:52.416447 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:52.416421 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c6d571_c228_4b53_8d5e_c96359b3d8f6.slice/crio-0565006e139c5c419183019dd98e9cfbac707f70123c2d32dd03d3e8df231770 WatchSource:0}: Error finding container 0565006e139c5c419183019dd98e9cfbac707f70123c2d32dd03d3e8df231770: Status 404 returned error can't find the container with id 0565006e139c5c419183019dd98e9cfbac707f70123c2d32dd03d3e8df231770
Apr 20 10:01:52.423387 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.423365 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf"]
Apr 20 10:01:52.426381 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:52.426360 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289379f5_7b90_499a_a7cd_14690b1bb4b1.slice/crio-64d3ead9a3e89f4e366f55c886f72c27a909bf6152fff9041c8354949d0c0fa9 WatchSource:0}: Error finding container 64d3ead9a3e89f4e366f55c886f72c27a909bf6152fff9041c8354949d0c0fa9: Status 404 returned error can't find the container with id 64d3ead9a3e89f4e366f55c886f72c27a909bf6152fff9041c8354949d0c0fa9
Apr 20 10:01:52.442431 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.442402 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5fn5"]
Apr 20 10:01:52.445252 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:52.445233 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cedfb03_6b25_46db_a934_2933c2d42473.slice/crio-e65217f43982ec28b0707552fc2bfec9399cfc1d02c4304b1186cbc6e8040305 WatchSource:0}: Error finding container e65217f43982ec28b0707552fc2bfec9399cfc1d02c4304b1186cbc6e8040305: Status 404 returned error can't find the container with id e65217f43982ec28b0707552fc2bfec9399cfc1d02c4304b1186cbc6e8040305
Apr 20 10:01:52.631228 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.631166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:52.631228 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.631218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:52.631363 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.631276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:52.631363 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631339 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 10:01:52.631426 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631386 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 10:01:52.631426 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631388 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 10:01:52.631426 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631405 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found
Apr 20 10:01:52.631426 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631412 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.631389296 +0000 UTC m=+40.054450421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found
Apr 20 10:01:52.631561 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631435 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls podName:e27ef370-a030-44aa-a961-156382685e11 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.63141992 +0000 UTC m=+40.054481062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k822b" (UID: "e27ef370-a030-44aa-a961-156382685e11") : secret "samples-operator-tls" not found
Apr 20 10:01:52.631561 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.631450 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.631443443 +0000 UTC m=+40.054504564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found
Apr 20 10:01:52.732407 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.732375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"
Apr 20 10:01:52.732522 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.732501 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 10:01:52.732558 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.732549 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.732537726 +0000 UTC m=+40.155598846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found
Apr 20 10:01:52.833768 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.833739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.833899 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:52.833824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:52.833994 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.833981 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.833965446 +0000 UTC m=+40.257026582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt
Apr 20 10:01:52.834070 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.834058 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 10:01:52.834117 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:52.834092 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:53.834082079 +0000 UTC m=+40.257143207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found
Apr 20 10:01:53.262049 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.262011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5fn5" event={"ID":"9cedfb03-6b25-46db-a934-2933c2d42473","Type":"ContainerStarted","Data":"e65217f43982ec28b0707552fc2bfec9399cfc1d02c4304b1186cbc6e8040305"}
Apr 20 10:01:53.266200 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.266120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" event={"ID":"289379f5-7b90-499a-a7cd-14690b1bb4b1","Type":"ContainerStarted","Data":"64d3ead9a3e89f4e366f55c886f72c27a909bf6152fff9041c8354949d0c0fa9"}
Apr 20 10:01:53.271048 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.270313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" event={"ID":"73ef9bd2-5eeb-4e74-9dfe-17214e80e475","Type":"ContainerStarted","Data":"3ce1c49eca53eac1bdbefc9e6f7c51a022f45a27d4017e980abdd3d4cdf4ea75"}
Apr 20 10:01:53.271048 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.270342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" event={"ID":"73ef9bd2-5eeb-4e74-9dfe-17214e80e475","Type":"ContainerStarted","Data":"e4ba792fea5256aef79dba4461efe86ad322f10a0659b582e340e4c1f817df8d"}
Apr 20 10:01:53.273807 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.273751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" event={"ID":"ae249ad7-50d4-4db6-be40-535b35542e1c","Type":"ContainerStarted","Data":"b2182381c7f707c1264b90c959e438b7739600e9e84a138ba634e6a85d321f21"}
Apr 20 10:01:53.276697 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.276627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" event={"ID":"81c6d571-c228-4b53-8d5e-c96359b3d8f6","Type":"ContainerStarted","Data":"0565006e139c5c419183019dd98e9cfbac707f70123c2d32dd03d3e8df231770"}
Apr 20 10:01:53.290001 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.288856 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4dcjg" podStartSLOduration=2.288841161 podStartE2EDuration="2.288841161s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:53.288184413 +0000 UTC m=+39.711245557" watchObservedRunningTime="2026-04-20 10:01:53.288841161 +0000 UTC m=+39.711902307"
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.656176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.656277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.656339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656508 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls podName:e27ef370-a030-44aa-a961-156382685e11 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.656552819 +0000 UTC m=+42.079613947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k822b" (UID: "e27ef370-a030-44aa-a961-156382685e11") : secret "samples-operator-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656638 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656648 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656700 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.656687801 +0000 UTC m=+42.079748939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656757 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 10:01:53.657171 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.656787 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.656777071 +0000 UTC m=+42.079838196 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found
Apr 20 10:01:53.757626 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.757095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"
Apr 20 10:01:53.757626 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.757368 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 10:01:53.757626 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.757430 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.757411527 +0000 UTC m=+42.180472651 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found
Apr 20 10:01:53.859011 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.858172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:53.859011 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:53.858318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:01:53.859011 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.858457 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 10:01:53.859011 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.858560 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.858541373 +0000 UTC m=+42.281602499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found
Apr 20 10:01:53.859011 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:53.858958 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:55.858942874 +0000 UTC m=+42.282004000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt
Apr 20 10:01:54.776012 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:54.775976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:01:54.776388 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:54.776112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:01:54.776388 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:54.776174 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:01:54.776388 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:54.776230 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:01:54.776388 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:54.776245 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:02:02.776227612 +0000 UTC m=+49.199288739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found
Apr 20 10:01:54.776388 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:54.776280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:02.776264392 +0000 UTC m=+49.199325519 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found
Apr 20 10:01:55.684525 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.684489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:01:55.684710 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.684592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:01:55.684710 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684634 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 10:01:55.684710 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684653 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found
Apr 20 10:01:55.684710 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.684682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" Apr 20 10:01:55.684874 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684728 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.684710343 +0000 UTC m=+46.107771486 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found Apr 20 10:01:55.684874 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684758 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:55.684874 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684773 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 10:01:55.684874 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684840 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls podName:e27ef370-a030-44aa-a961-156382685e11 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.684828381 +0000 UTC m=+46.107889519 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k822b" (UID: "e27ef370-a030-44aa-a961-156382685e11") : secret "samples-operator-tls" not found Apr 20 10:01:55.684874 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.684852 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.684846982 +0000 UTC m=+46.107908103 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:55.785266 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.785230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:55.785713 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.785346 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 10:01:55.785713 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.785419 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert 
podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.785399911 +0000 UTC m=+46.208461051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found Apr 20 10:01:55.886611 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.886577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:55.886757 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.886717 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.88669744 +0000 UTC m=+46.309758561 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt Apr 20 10:01:55.886813 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:55.886775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:55.886867 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.886850 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 10:01:55.886902 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:55.886880 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:01:59.886872874 +0000 UTC m=+46.309933995 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found Apr 20 10:01:58.291470 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.291446 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/0.log" Apr 20 10:01:58.291929 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.291484 2577 generic.go:358] "Generic (PLEG): container finished" podID="db045f44-d582-4037-82eb-d656372b093e" containerID="dbb2e7f9028a9d039e414d00367327c1a40ec0af019bdeae49821ecad3583473" exitCode=255 Apr 20 10:01:58.291929 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.291515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" event={"ID":"db045f44-d582-4037-82eb-d656372b093e","Type":"ContainerDied","Data":"dbb2e7f9028a9d039e414d00367327c1a40ec0af019bdeae49821ecad3583473"} Apr 20 10:01:58.291929 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.291792 2577 scope.go:117] "RemoveContainer" containerID="dbb2e7f9028a9d039e414d00367327c1a40ec0af019bdeae49821ecad3583473" Apr 20 10:01:58.293148 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.293105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" event={"ID":"ae249ad7-50d4-4db6-be40-535b35542e1c","Type":"ContainerStarted","Data":"d9b3374eb46a4f59c80eda4279f7da02ffeb64d377616b52fb9d9ce383867426"} Apr 20 10:01:58.294622 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.294599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" 
event={"ID":"81c6d571-c228-4b53-8d5e-c96359b3d8f6","Type":"ContainerStarted","Data":"15a46ca0dbfee901d60bed390d5aee7d63535e793baf872b34188aebf3181b90"} Apr 20 10:01:58.296183 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.296156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5fn5" event={"ID":"9cedfb03-6b25-46db-a934-2933c2d42473","Type":"ContainerStarted","Data":"4172cbda1740a915afb9a5341737b6bf261360f5ed99ef70ecfba4be80130fe7"} Apr 20 10:01:58.297611 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.297591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" event={"ID":"289379f5-7b90-499a-a7cd-14690b1bb4b1","Type":"ContainerStarted","Data":"4ed22101c1a8516f350a2206bd7f405166fa65e4e68c269a878b88fec7116caa"} Apr 20 10:01:58.360440 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.360393 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" podStartSLOduration=2.56335555 podStartE2EDuration="7.360377833s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:01:52.418878137 +0000 UTC m=+38.841939258" lastFinishedPulling="2026-04-20 10:01:57.21590042 +0000 UTC m=+43.638961541" observedRunningTime="2026-04-20 10:01:58.359705653 +0000 UTC m=+44.782766810" watchObservedRunningTime="2026-04-20 10:01:58.360377833 +0000 UTC m=+44.783438977" Apr 20 10:01:58.391804 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.391760 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-s5fn5" podStartSLOduration=2.617707348 podStartE2EDuration="7.391745968s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:01:52.446799366 +0000 UTC m=+38.869860486" 
lastFinishedPulling="2026-04-20 10:01:57.220837971 +0000 UTC m=+43.643899106" observedRunningTime="2026-04-20 10:01:58.390537608 +0000 UTC m=+44.813598753" watchObservedRunningTime="2026-04-20 10:01:58.391745968 +0000 UTC m=+44.814807114" Apr 20 10:01:58.421514 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.420879 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-djf8k" podStartSLOduration=2.51101195 podStartE2EDuration="7.420853184s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:01:52.30588113 +0000 UTC m=+38.728942256" lastFinishedPulling="2026-04-20 10:01:57.215722354 +0000 UTC m=+43.638783490" observedRunningTime="2026-04-20 10:01:58.419587326 +0000 UTC m=+44.842648480" watchObservedRunningTime="2026-04-20 10:01:58.420853184 +0000 UTC m=+44.843914328" Apr 20 10:01:58.442789 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.442247 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" podStartSLOduration=2.654491357 podStartE2EDuration="7.442230423s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:01:52.428191416 +0000 UTC m=+38.851252537" lastFinishedPulling="2026-04-20 10:01:57.215930475 +0000 UTC m=+43.638991603" observedRunningTime="2026-04-20 10:01:58.441569857 +0000 UTC m=+44.864631000" watchObservedRunningTime="2026-04-20 10:01:58.442230423 +0000 UTC m=+44.865291570" Apr 20 10:01:58.612316 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.612290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " 
pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:58.615528 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.615499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9968b560-1fb0-4930-8c96-a8878efe7d90-original-pull-secret\") pod \"global-pull-secret-syncer-5r7zs\" (UID: \"9968b560-1fb0-4930-8c96-a8878efe7d90\") " pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:58.727245 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.727221 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb"] Apr 20 10:01:58.740312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.740290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" Apr 20 10:01:58.742995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.742976 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb"] Apr 20 10:01:58.743841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.743571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 10:01:58.743841 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.743769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-k6j8x\"" Apr 20 10:01:58.744419 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.744391 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 10:01:58.790041 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.790011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5r7zs" Apr 20 10:01:58.814026 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.813999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gscvm\" (UniqueName: \"kubernetes.io/projected/9701f42d-6084-4aa6-9f1d-845738e47a33-kube-api-access-gscvm\") pod \"migrator-74bb7799d9-zjnpb\" (UID: \"9701f42d-6084-4aa6-9f1d-845738e47a33\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" Apr 20 10:01:58.914229 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.914209 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5r7zs"] Apr 20 10:01:58.914711 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.914686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gscvm\" (UniqueName: \"kubernetes.io/projected/9701f42d-6084-4aa6-9f1d-845738e47a33-kube-api-access-gscvm\") pod \"migrator-74bb7799d9-zjnpb\" (UID: \"9701f42d-6084-4aa6-9f1d-845738e47a33\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" Apr 20 10:01:58.916863 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:58.916834 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9968b560_1fb0_4930_8c96_a8878efe7d90.slice/crio-1e1159f0f376d35405ddfdb071055a68df86fd655de95815d5afa09ff65979e5 WatchSource:0}: Error finding container 1e1159f0f376d35405ddfdb071055a68df86fd655de95815d5afa09ff65979e5: Status 404 returned error can't find the container with id 1e1159f0f376d35405ddfdb071055a68df86fd655de95815d5afa09ff65979e5 Apr 20 10:01:58.923397 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:58.923372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gscvm\" (UniqueName: 
\"kubernetes.io/projected/9701f42d-6084-4aa6-9f1d-845738e47a33-kube-api-access-gscvm\") pod \"migrator-74bb7799d9-zjnpb\" (UID: \"9701f42d-6084-4aa6-9f1d-845738e47a33\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" Apr 20 10:01:59.058356 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.058329 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" Apr 20 10:01:59.182769 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.182747 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb"] Apr 20 10:01:59.184777 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:01:59.184746 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9701f42d_6084_4aa6_9f1d_845738e47a33.slice/crio-aac58b993731c6c34c2882067eb8d8cf5e1ec02c7dec071ed01fa7ef989eac29 WatchSource:0}: Error finding container aac58b993731c6c34c2882067eb8d8cf5e1ec02c7dec071ed01fa7ef989eac29: Status 404 returned error can't find the container with id aac58b993731c6c34c2882067eb8d8cf5e1ec02c7dec071ed01fa7ef989eac29 Apr 20 10:01:59.300828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.300802 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/1.log" Apr 20 10:01:59.301185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.301170 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/0.log" Apr 20 10:01:59.301229 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.301210 2577 generic.go:358] "Generic (PLEG): container finished" podID="db045f44-d582-4037-82eb-d656372b093e" 
containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a" exitCode=255 Apr 20 10:01:59.301289 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.301273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" event={"ID":"db045f44-d582-4037-82eb-d656372b093e","Type":"ContainerDied","Data":"fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a"} Apr 20 10:01:59.301340 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.301327 2577 scope.go:117] "RemoveContainer" containerID="dbb2e7f9028a9d039e414d00367327c1a40ec0af019bdeae49821ecad3583473" Apr 20 10:01:59.301503 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.301487 2577 scope.go:117] "RemoveContainer" containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a" Apr 20 10:01:59.301726 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.301705 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e" Apr 20 10:01:59.302762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.302407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5r7zs" event={"ID":"9968b560-1fb0-4930-8c96-a8878efe7d90","Type":"ContainerStarted","Data":"1e1159f0f376d35405ddfdb071055a68df86fd655de95815d5afa09ff65979e5"} Apr 20 10:01:59.303476 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.303447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" 
event={"ID":"9701f42d-6084-4aa6-9f1d-845738e47a33","Type":"ContainerStarted","Data":"aac58b993731c6c34c2882067eb8d8cf5e1ec02c7dec071ed01fa7ef989eac29"} Apr 20 10:01:59.721600 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.721570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:01:59.721799 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.721649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:01:59.721799 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.721741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" Apr 20 10:01:59.721799 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721756 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 10:01:59.721799 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721764 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:59.721799 ip-10-0-140-95 
kubenswrapper[2577]: E0420 10:01:59.721778 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found Apr 20 10:01:59.722059 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721839 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.721820115 +0000 UTC m=+54.144881242 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found Apr 20 10:01:59.722059 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721857 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.721847073 +0000 UTC m=+54.144908200 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found Apr 20 10:01:59.722059 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721906 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 10:01:59.722059 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.721971 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls podName:e27ef370-a030-44aa-a961-156382685e11 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.721954105 +0000 UTC m=+54.145015229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k822b" (UID: "e27ef370-a030-44aa-a961-156382685e11") : secret "samples-operator-tls" not found Apr 20 10:01:59.823093 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.823063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:01:59.823226 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.823198 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 10:01:59.823277 ip-10-0-140-95 
kubenswrapper[2577]: E0420 10:01:59.823264 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.823245363 +0000 UTC m=+54.246306502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found Apr 20 10:01:59.924027 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.924000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:59.924172 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.924158 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.92414434 +0000 UTC m=+54.347205473 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt Apr 20 10:01:59.924439 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:01:59.924424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:01:59.924526 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.924514 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 10:01:59.924561 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:01:59.924549 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.924539976 +0000 UTC m=+54.347601123 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found
Apr 20 10:02:00.309636 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:00.309611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/1.log"
Apr 20 10:02:00.310073 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:00.310053 2577 scope.go:117] "RemoveContainer" containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a"
Apr 20 10:02:00.310307 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:00.310284 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e"
Apr 20 10:02:01.255647 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:01.255625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fr996_a533ff84-2d61-4ccb-9b58-1eea4acb387d/dns-node-resolver/0.log"
Apr 20 10:02:01.314724 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:01.314689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" event={"ID":"9701f42d-6084-4aa6-9f1d-845738e47a33","Type":"ContainerStarted","Data":"23f0a2a92e12f59466fd8cdd97fd1109a91973aa560dcf0b78959eb9a92f13fd"}
Apr 20 10:02:01.315059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:01.314731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" event={"ID":"9701f42d-6084-4aa6-9f1d-845738e47a33","Type":"ContainerStarted","Data":"22b727897ece937b22a519dc4fa2f3e877270e505dc6309f1dd806fa7d49cf75"}
Apr 20 10:02:01.336720 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:01.336653 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zjnpb" podStartSLOduration=1.574761603 podStartE2EDuration="3.33663506s" podCreationTimestamp="2026-04-20 10:01:58 +0000 UTC" firstStartedPulling="2026-04-20 10:01:59.186596446 +0000 UTC m=+45.609657567" lastFinishedPulling="2026-04-20 10:02:00.948469889 +0000 UTC m=+47.371531024" observedRunningTime="2026-04-20 10:02:01.334690727 +0000 UTC m=+47.757751872" watchObservedRunningTime="2026-04-20 10:02:01.33663506 +0000 UTC m=+47.759696204"
Apr 20 10:02:02.062971 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.062935 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:02:02.063151 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.062989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2"
Apr 20 10:02:02.063357 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.063337 2577 scope.go:117] "RemoveContainer" containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a"
Apr 20 10:02:02.063539 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:02.063516 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e"
Apr 20 10:02:02.657452 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.657427 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zsznm_2eb7affb-8768-41ec-85fb-a62a41bb8709/node-ca/0.log"
Apr 20 10:02:02.848984 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.848953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:02:02.849105 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:02.849043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:02:02.849105 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:02.849089 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:02:02.849178 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:02.849149 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:02:02.849178 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:02.849155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls podName:2dce1807-1577-4d4f-8a49-740ba99a59ca nodeName:}" failed. No retries permitted until 2026-04-20 10:02:18.849141957 +0000 UTC m=+65.272203081 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls") pod "dns-default-cqrj9" (UID: "2dce1807-1577-4d4f-8a49-740ba99a59ca") : secret "dns-default-metrics-tls" not found
Apr 20 10:02:02.849244 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:02.849198 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert podName:84d8b916-498e-4189-840c-c6931e4b0d70 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:18.849185838 +0000 UTC m=+65.272246959 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert") pod "ingress-canary-5fzkf" (UID: "84d8b916-498e-4189-840c-c6931e4b0d70") : secret "canary-serving-cert" not found
Apr 20 10:02:03.320865 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:03.320788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5r7zs" event={"ID":"9968b560-1fb0-4930-8c96-a8878efe7d90","Type":"ContainerStarted","Data":"5cb978db3e2b9c8a481cc80ee1d275e8c8ba7af4774faacfd3f9a66608cf554a"}
Apr 20 10:02:03.349119 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:03.349073 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5r7zs" podStartSLOduration=33.233775593 podStartE2EDuration="37.349059341s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:58.918486631 +0000 UTC m=+45.341547752" lastFinishedPulling="2026-04-20 10:02:03.033770365 +0000 UTC m=+49.456831500" observedRunningTime="2026-04-20 10:02:03.347842117 +0000 UTC m=+49.770903260" watchObservedRunningTime="2026-04-20 10:02:03.349059341 +0000 UTC m=+49.772120484"
Apr 20 10:02:03.456024 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:03.455998 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zjnpb_9701f42d-6084-4aa6-9f1d-845738e47a33/migrator/0.log"
Apr 20 10:02:03.656314 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:03.656287 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zjnpb_9701f42d-6084-4aa6-9f1d-845738e47a33/graceful-termination/0.log"
Apr 20 10:02:03.858667 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:03.858627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dk6lf_289379f5-7b90-499a-a7cd-14690b1bb4b1/kube-storage-version-migrator-operator/0.log"
Apr 20 10:02:07.790398 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.790363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") pod \"image-registry-58f946d5-ggd48\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") " pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.790427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.790465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.790503 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.790520 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f946d5-ggd48: secret "image-registry-tls" not found
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.790572 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls podName:402884ec-093d-41f3-93e9-b7964f3d07af nodeName:}" failed. No retries permitted until 2026-04-20 10:02:23.790557051 +0000 UTC m=+70.213618172 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls") pod "image-registry-58f946d5-ggd48" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af") : secret "image-registry-tls" not found
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.790585 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 10:02:07.790781 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.790641 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls podName:5cd2cf42-4b4b-4260-963f-fd7f94555d35 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:23.790624206 +0000 UTC m=+70.213685363 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-t7cfx" (UID: "5cd2cf42-4b4b-4260-963f-fd7f94555d35") : secret "cluster-monitoring-operator-tls" not found
Apr 20 10:02:07.792916 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.792899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e27ef370-a030-44aa-a961-156382685e11-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k822b\" (UID: \"e27ef370-a030-44aa-a961-156382685e11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:02:07.891754 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.891729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"
Apr 20 10:02:07.891877 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.891850 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 10:02:07.891920 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.891901 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert podName:5a97f42a-851b-4803-9be6-3ad666e6f307 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:23.891889059 +0000 UTC m=+70.314950183 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7rdz4" (UID: "5a97f42a-851b-4803-9be6-3ad666e6f307") : secret "networking-console-plugin-cert" not found
Apr 20 10:02:07.992412 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.992389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:02:07.992551 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:07.992474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896"
Apr 20 10:02:07.992551 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.992505 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 10:02:07.992688 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.992549 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:02:23.992537532 +0000 UTC m=+70.415598653 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : secret "router-metrics-certs-default" not found
Apr 20 10:02:07.992688 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:07.992610 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle podName:adc46263-5b99-4162-8415-8a084543bdad nodeName:}" failed. No retries permitted until 2026-04-20 10:02:23.992595666 +0000 UTC m=+70.415656802 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle") pod "router-default-7dbc9c6698-7l896" (UID: "adc46263-5b99-4162-8415-8a084543bdad") : configmap references non-existent config key: service-ca.crt
Apr 20 10:02:08.084206 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:08.084148 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"
Apr 20 10:02:08.199767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:08.199739 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b"]
Apr 20 10:02:08.334412 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:08.334347 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" event={"ID":"e27ef370-a030-44aa-a961-156382685e11","Type":"ContainerStarted","Data":"941dc164fa6b056692a7b3dde30d2bd9e8ffed2d298fd3072211cebd353a6d8f"}
Apr 20 10:02:10.343304 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:10.343268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" event={"ID":"e27ef370-a030-44aa-a961-156382685e11","Type":"ContainerStarted","Data":"747ac319f495952b3367132da0609b02f0e145e2794a8a060c6ac7ec36f181e9"}
Apr 20 10:02:10.343304 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:10.343303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" event={"ID":"e27ef370-a030-44aa-a961-156382685e11","Type":"ContainerStarted","Data":"175d2d59aa8831e928c67c08dc7b0a5ee9fd68021cc168472dd9025e5c1e229d"}
Apr 20 10:02:10.361074 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:10.361031 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k822b" podStartSLOduration=17.774744174 podStartE2EDuration="19.361017678s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:02:08.241102915 +0000 UTC m=+54.664164036" lastFinishedPulling="2026-04-20 10:02:09.827376415 +0000 UTC m=+56.250437540" observedRunningTime="2026-04-20 10:02:10.359951009 +0000 UTC m=+56.783012149" watchObservedRunningTime="2026-04-20 10:02:10.361017678 +0000 UTC m=+56.784078820"
Apr 20 10:02:13.233343 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:13.233312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bxbxw"
Apr 20 10:02:15.078251 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.078221 2577 scope.go:117] "RemoveContainer" containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a"
Apr 20 10:02:15.357646 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.357594 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:02:15.358043 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.358026 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/1.log"
Apr 20 10:02:15.358134 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.358064 2577 generic.go:358] "Generic (PLEG): container finished" podID="db045f44-d582-4037-82eb-d656372b093e" containerID="ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0" exitCode=255
Apr 20 10:02:15.358134 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.358117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" event={"ID":"db045f44-d582-4037-82eb-d656372b093e","Type":"ContainerDied","Data":"ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0"}
Apr 20 10:02:15.358237 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.358156 2577 scope.go:117] "RemoveContainer" containerID="fa2cf62509ced9de785729107c0933454dac5060cb011e3effa98071924c983a"
Apr 20 10:02:15.358444 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:15.358419 2577 scope.go:117] "RemoveContainer" containerID="ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0"
Apr 20 10:02:15.358647 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:15.358628 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e"
Apr 20 10:02:16.362628 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:16.362595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:02:18.883802 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:18.883762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:02:18.884188 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:18.883871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:02:18.886274 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:18.886241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2dce1807-1577-4d4f-8a49-740ba99a59ca-metrics-tls\") pod \"dns-default-cqrj9\" (UID: \"2dce1807-1577-4d4f-8a49-740ba99a59ca\") " pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:02:18.886422 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:18.886404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d8b916-498e-4189-840c-c6931e4b0d70-cert\") pod \"ingress-canary-5fzkf\" (UID: \"84d8b916-498e-4189-840c-c6931e4b0d70\") " pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:02:19.094311 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.094283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qvhbt\""
Apr 20 10:02:19.098814 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.098793 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4xmrs\""
Apr 20 10:02:19.102056 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.102038 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqrj9"
Apr 20 10:02:19.106732 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.106709 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5fzkf"
Apr 20 10:02:19.260994 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.260917 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqrj9"]
Apr 20 10:02:19.263472 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:19.263442 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dce1807_1577_4d4f_8a49_740ba99a59ca.slice/crio-463c99c44f9ff2a51888ba0903e17e87799505305327db12d1bb5ef78b739142 WatchSource:0}: Error finding container 463c99c44f9ff2a51888ba0903e17e87799505305327db12d1bb5ef78b739142: Status 404 returned error can't find the container with id 463c99c44f9ff2a51888ba0903e17e87799505305327db12d1bb5ef78b739142
Apr 20 10:02:19.272818 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.272799 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5fzkf"]
Apr 20 10:02:19.274948 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:19.274925 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d8b916_498e_4189_840c_c6931e4b0d70.slice/crio-b469c22c29e64f4930b3e63f28e275f3cb9081e6815cb39b0afd3a8e731f0492 WatchSource:0}: Error finding container b469c22c29e64f4930b3e63f28e275f3cb9081e6815cb39b0afd3a8e731f0492: Status 404 returned error can't find the container with id b469c22c29e64f4930b3e63f28e275f3cb9081e6815cb39b0afd3a8e731f0492
Apr 20 10:02:19.370989 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.370957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5fzkf" event={"ID":"84d8b916-498e-4189-840c-c6931e4b0d70","Type":"ContainerStarted","Data":"b469c22c29e64f4930b3e63f28e275f3cb9081e6815cb39b0afd3a8e731f0492"}
Apr 20 10:02:19.371873 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.371851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqrj9" event={"ID":"2dce1807-1577-4d4f-8a49-740ba99a59ca","Type":"ContainerStarted","Data":"463c99c44f9ff2a51888ba0903e17e87799505305327db12d1bb5ef78b739142"}
Apr 20 10:02:19.893647 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.893610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:02:19.896396 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:19.896364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a07ac99-a265-4370-a43b-b11246f741de-metrics-certs\") pod \"network-metrics-daemon-vs775\" (UID: \"6a07ac99-a265-4370-a43b-b11246f741de\") " pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:02:20.104727 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.104696 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\""
Apr 20 10:02:20.113310 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.112989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs775"
Apr 20 10:02:20.210861 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.210833 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58f946d5-ggd48"]
Apr 20 10:02:20.212864 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:20.212822 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-58f946d5-ggd48" podUID="402884ec-093d-41f3-93e9-b7964f3d07af"
Apr 20 10:02:20.275017 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.274983 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vs775"]
Apr 20 10:02:20.284774 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:20.284737 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a07ac99_a265_4370_a43b_b11246f741de.slice/crio-e7d58ed14fabfa8d65283d8bc366b011647b75529faf083a2b78acaa629439b5 WatchSource:0}: Error finding container e7d58ed14fabfa8d65283d8bc366b011647b75529faf083a2b78acaa629439b5: Status 404 returned error can't find the container with id e7d58ed14fabfa8d65283d8bc366b011647b75529faf083a2b78acaa629439b5
Apr 20 10:02:20.337287 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.337264 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-84cjk"]
Apr 20 10:02:20.360410 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.360254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.360410 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.360359 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-84cjk"]
Apr 20 10:02:20.365252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.363908 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wtn5t\""
Apr 20 10:02:20.365252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.364746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 10:02:20.365252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.365098 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 10:02:20.377048 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.377022 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs775" event={"ID":"6a07ac99-a265-4370-a43b-b11246f741de","Type":"ContainerStarted","Data":"e7d58ed14fabfa8d65283d8bc366b011647b75529faf083a2b78acaa629439b5"}
Apr 20 10:02:20.386591 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.386569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:02:20.390976 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.390956 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58f946d5-ggd48"
Apr 20 10:02:20.498222 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498154 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498222 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498216 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498297 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfbzd\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498316 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498336 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498393 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498357 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted\") pod \"402884ec-093d-41f3-93e9-b7964f3d07af\" (UID: \"402884ec-093d-41f3-93e9-b7964f3d07af\") "
Apr 20 10:02:20.498574 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e2995a2-df23-416f-8edb-670e8832f5ce-data-volume\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.498574 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.498574 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7m5\" (UniqueName: \"kubernetes.io/projected/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-api-access-gj7m5\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.498708 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e2995a2-df23-416f-8edb-670e8832f5ce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.498708 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.498616 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e2995a2-df23-416f-8edb-670e8832f5ce-crio-socket\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk"
Apr 20 10:02:20.499474 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.499266 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:02:20.499474 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.499376 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:02:20.499474 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.499452 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:02:20.501787 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.501755 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:02:20.501903 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.501856 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:02:20.501993 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.501971 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd" (OuterVolumeSpecName: "kube-api-access-gfbzd") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af").
InnerVolumeSpecName "kube-api-access-gfbzd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:02:20.502935 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.502912 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "402884ec-093d-41f3-93e9-b7964f3d07af" (UID: "402884ec-093d-41f3-93e9-b7964f3d07af"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:02:20.599859 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.599828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e2995a2-df23-416f-8edb-670e8832f5ce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600014 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.599868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e2995a2-df23-416f-8edb-670e8832f5ce-crio-socket\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600014 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.599965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e2995a2-df23-416f-8edb-670e8832f5ce-crio-socket\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600186 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e2995a2-df23-416f-8edb-670e8832f5ce-data-volume\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600186 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600186 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7m5\" (UniqueName: \"kubernetes.io/projected/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-api-access-gj7m5\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600261 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-image-registry-private-configuration\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600280 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-registry-certificates\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600295 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfbzd\" (UniqueName: 
\"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-kube-api-access-gfbzd\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600309 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-bound-sa-token\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600323 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/402884ec-093d-41f3-93e9-b7964f3d07af-installation-pull-secrets\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600548 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600337 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/402884ec-093d-41f3-93e9-b7964f3d07af-ca-trust-extracted\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600548 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600354 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/402884ec-093d-41f3-93e9-b7964f3d07af-trusted-ca\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:20.600548 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e2995a2-df23-416f-8edb-670e8832f5ce-data-volume\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.600762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.600737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.602503 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.602481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e2995a2-df23-416f-8edb-670e8832f5ce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.610827 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.610799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7m5\" (UniqueName: \"kubernetes.io/projected/8e2995a2-df23-416f-8edb-670e8832f5ce-kube-api-access-gj7m5\") pod \"insights-runtime-extractor-84cjk\" (UID: \"8e2995a2-df23-416f-8edb-670e8832f5ce\") " pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:20.676277 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:20.676253 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-84cjk" Apr 20 10:02:21.380164 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:21.380130 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58f946d5-ggd48" Apr 20 10:02:21.415816 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:21.415789 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58f946d5-ggd48"] Apr 20 10:02:21.417213 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:21.417190 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-58f946d5-ggd48"] Apr 20 10:02:21.607335 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:21.607307 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/402884ec-093d-41f3-93e9-b7964f3d07af-registry-tls\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:02:22.063128 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.063095 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" Apr 20 10:02:22.063230 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.063145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" Apr 20 10:02:22.063603 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.063570 2577 scope.go:117] "RemoveContainer" containerID="ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0" Apr 20 10:02:22.063887 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:22.063810 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e" Apr 20 10:02:22.084012 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:02:22.083796 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402884ec-093d-41f3-93e9-b7964f3d07af" path="/var/lib/kubelet/pods/402884ec-093d-41f3-93e9-b7964f3d07af/volumes" Apr 20 10:02:22.131796 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.131771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-84cjk"] Apr 20 10:02:22.134683 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:22.134639 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2995a2_df23_416f_8edb_670e8832f5ce.slice/crio-6ee3836a3fd5517d7a47eb315baf202628265ca1f961cbe00242535079f222bf WatchSource:0}: Error finding container 6ee3836a3fd5517d7a47eb315baf202628265ca1f961cbe00242535079f222bf: Status 404 returned error can't find the container with id 6ee3836a3fd5517d7a47eb315baf202628265ca1f961cbe00242535079f222bf Apr 20 10:02:22.386539 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.386503 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5fzkf" event={"ID":"84d8b916-498e-4189-840c-c6931e4b0d70","Type":"ContainerStarted","Data":"b67709bd4ac108657031834ece29c19e627715b65968a0a143abe585f09c00d4"} Apr 20 10:02:22.388367 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.388346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqrj9" event={"ID":"2dce1807-1577-4d4f-8a49-740ba99a59ca","Type":"ContainerStarted","Data":"3af9d0aa4349a55ac6415ac6332955f85876510bd77b3d41db9e2e9853046f46"} Apr 20 10:02:22.391398 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.391372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84cjk" event={"ID":"8e2995a2-df23-416f-8edb-670e8832f5ce","Type":"ContainerStarted","Data":"d28b30709b511ceb0d9c74b61f7de93b7ff3feaf999e7ef8be76015e736f05a5"} Apr 20 10:02:22.391493 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.391409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84cjk" event={"ID":"8e2995a2-df23-416f-8edb-670e8832f5ce","Type":"ContainerStarted","Data":"6ee3836a3fd5517d7a47eb315baf202628265ca1f961cbe00242535079f222bf"} Apr 20 10:02:22.404105 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:22.403758 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5fzkf" podStartSLOduration=33.669619008 podStartE2EDuration="36.403741515s" podCreationTimestamp="2026-04-20 10:01:46 +0000 UTC" firstStartedPulling="2026-04-20 10:02:19.276632824 +0000 UTC m=+65.699693944" lastFinishedPulling="2026-04-20 10:02:22.01075533 +0000 UTC m=+68.433816451" observedRunningTime="2026-04-20 10:02:22.40247826 +0000 UTC m=+68.825539426" watchObservedRunningTime="2026-04-20 10:02:22.403741515 +0000 UTC m=+68.826802663" Apr 20 10:02:23.279311 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.279291 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gm5vg" Apr 20 10:02:23.395265 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.395236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqrj9" event={"ID":"2dce1807-1577-4d4f-8a49-740ba99a59ca","Type":"ContainerStarted","Data":"d3de255eacb0f9559645500fa60bb4efa452a420d524d57eadf072808d5dc999"} Apr 20 10:02:23.395611 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.395363 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cqrj9" Apr 20 10:02:23.397092 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.397062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84cjk" 
event={"ID":"8e2995a2-df23-416f-8edb-670e8832f5ce","Type":"ContainerStarted","Data":"e659196ba9e0a6aab6e35798487f137d1e2c4272139705c7a7d7f8f035fb99d7"} Apr 20 10:02:23.398573 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.398549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs775" event={"ID":"6a07ac99-a265-4370-a43b-b11246f741de","Type":"ContainerStarted","Data":"04797a28ee95d15ba09b52fe225affde2b7118f1ddbd2dedb1c3c1aeea871eb2"} Apr 20 10:02:23.398701 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.398579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs775" event={"ID":"6a07ac99-a265-4370-a43b-b11246f741de","Type":"ContainerStarted","Data":"396329f3efad3d5ddac22d19286808d99559dc3c8fe99d370f56d3aabacd7625"} Apr 20 10:02:23.420412 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.420372 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cqrj9" podStartSLOduration=34.68043195 podStartE2EDuration="37.420360364s" podCreationTimestamp="2026-04-20 10:01:46 +0000 UTC" firstStartedPulling="2026-04-20 10:02:19.26541339 +0000 UTC m=+65.688474511" lastFinishedPulling="2026-04-20 10:02:22.005341802 +0000 UTC m=+68.428402925" observedRunningTime="2026-04-20 10:02:23.418926222 +0000 UTC m=+69.841987376" watchObservedRunningTime="2026-04-20 10:02:23.420360364 +0000 UTC m=+69.843421508" Apr 20 10:02:23.438865 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.438833 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vs775" podStartSLOduration=67.341377311 podStartE2EDuration="1m9.438817191s" podCreationTimestamp="2026-04-20 10:01:14 +0000 UTC" firstStartedPulling="2026-04-20 10:02:20.287343891 +0000 UTC m=+66.710405025" lastFinishedPulling="2026-04-20 10:02:22.384783784 +0000 UTC m=+68.807844905" observedRunningTime="2026-04-20 10:02:23.438005055 
+0000 UTC m=+69.861066199" watchObservedRunningTime="2026-04-20 10:02:23.438817191 +0000 UTC m=+69.861878335" Apr 20 10:02:23.824626 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.824592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:02:23.827517 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.827495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cd2cf42-4b4b-4260-963f-fd7f94555d35-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-t7cfx\" (UID: \"5cd2cf42-4b4b-4260-963f-fd7f94555d35\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:02:23.925348 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.925319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:02:23.927910 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.927888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a97f42a-851b-4803-9be6-3ad666e6f307-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7rdz4\" (UID: \"5a97f42a-851b-4803-9be6-3ad666e6f307\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:02:23.981585 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.981564 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mn8pk\"" Apr 20 10:02:23.989111 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:23.989089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" Apr 20 10:02:24.025730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.025692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:24.025846 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.025769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:24.026283 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.026262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc46263-5b99-4162-8415-8a084543bdad-service-ca-bundle\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:24.028096 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.028075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/adc46263-5b99-4162-8415-8a084543bdad-metrics-certs\") pod \"router-default-7dbc9c6698-7l896\" (UID: \"adc46263-5b99-4162-8415-8a084543bdad\") " pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:24.121620 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.121591 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hf5zq\"" Apr 20 10:02:24.129724 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.129701 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" Apr 20 10:02:24.135060 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.135039 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx"] Apr 20 10:02:24.137139 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:24.137118 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd2cf42_4b4b_4260_963f_fd7f94555d35.slice/crio-b7276a7173d977910cb9fb814103609de3579b3995b9c303b6a896dd3d935c02 WatchSource:0}: Error finding container b7276a7173d977910cb9fb814103609de3579b3995b9c303b6a896dd3d935c02: Status 404 returned error can't find the container with id b7276a7173d977910cb9fb814103609de3579b3995b9c303b6a896dd3d935c02 Apr 20 10:02:24.147163 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.147142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rzqv4\"" Apr 20 10:02:24.155281 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.155256 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:24.261133 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.261104 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4"] Apr 20 10:02:24.402390 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.402342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" event={"ID":"5cd2cf42-4b4b-4260-963f-fd7f94555d35","Type":"ContainerStarted","Data":"b7276a7173d977910cb9fb814103609de3579b3995b9c303b6a896dd3d935c02"} Apr 20 10:02:24.572608 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:24.572571 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a97f42a_851b_4803_9be6_3ad666e6f307.slice/crio-548b65c531d325c977605a64b8ea6408512dc013c9e0928d6da0cb0ef99e415d WatchSource:0}: Error finding container 548b65c531d325c977605a64b8ea6408512dc013c9e0928d6da0cb0ef99e415d: Status 404 returned error can't find the container with id 548b65c531d325c977605a64b8ea6408512dc013c9e0928d6da0cb0ef99e415d Apr 20 10:02:24.693028 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:24.692970 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dbc9c6698-7l896"] Apr 20 10:02:24.694876 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:24.694853 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc46263_5b99_4162_8415_8a084543bdad.slice/crio-64725ebd83dc53ec8a2d44a1979b6e7539a5dc83cdca316db72948e172f3b7c4 WatchSource:0}: Error finding container 64725ebd83dc53ec8a2d44a1979b6e7539a5dc83cdca316db72948e172f3b7c4: Status 404 returned error can't find the container with id 64725ebd83dc53ec8a2d44a1979b6e7539a5dc83cdca316db72948e172f3b7c4 Apr 20 10:02:25.408798 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:02:25.408761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-84cjk" event={"ID":"8e2995a2-df23-416f-8edb-670e8832f5ce","Type":"ContainerStarted","Data":"a9af3c2c9ac4edfae53354d29c5d565cb83419b0a791c89947eefe8b9d614e35"} Apr 20 10:02:25.411037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:25.410572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dbc9c6698-7l896" event={"ID":"adc46263-5b99-4162-8415-8a084543bdad","Type":"ContainerStarted","Data":"b76be7133755cc8e59c65da5f13641866083d24880a5071d948fd2d6e24f2870"} Apr 20 10:02:25.411037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:25.410606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dbc9c6698-7l896" event={"ID":"adc46263-5b99-4162-8415-8a084543bdad","Type":"ContainerStarted","Data":"64725ebd83dc53ec8a2d44a1979b6e7539a5dc83cdca316db72948e172f3b7c4"} Apr 20 10:02:25.411788 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:25.411765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" event={"ID":"5a97f42a-851b-4803-9be6-3ad666e6f307","Type":"ContainerStarted","Data":"548b65c531d325c977605a64b8ea6408512dc013c9e0928d6da0cb0ef99e415d"} Apr 20 10:02:25.430536 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:25.430497 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-84cjk" podStartSLOduration=3.065931038 podStartE2EDuration="5.43048509s" podCreationTimestamp="2026-04-20 10:02:20 +0000 UTC" firstStartedPulling="2026-04-20 10:02:22.255183602 +0000 UTC m=+68.678244723" lastFinishedPulling="2026-04-20 10:02:24.619737651 +0000 UTC m=+71.042798775" observedRunningTime="2026-04-20 10:02:25.428211159 +0000 UTC m=+71.851272302" watchObservedRunningTime="2026-04-20 10:02:25.43048509 +0000 UTC 
m=+71.853546232" Apr 20 10:02:25.452389 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:25.452352 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dbc9c6698-7l896" podStartSLOduration=34.452340194 podStartE2EDuration="34.452340194s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:02:25.450091196 +0000 UTC m=+71.873152364" watchObservedRunningTime="2026-04-20 10:02:25.452340194 +0000 UTC m=+71.875401337" Apr 20 10:02:26.155913 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.155673 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:26.158141 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.158121 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:26.416295 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.416265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" event={"ID":"5a97f42a-851b-4803-9be6-3ad666e6f307","Type":"ContainerStarted","Data":"12a9d0352cb3c001be257c1401f0e8338bed2170b7fb0cdc2db41f26a80a1333"} Apr 20 10:02:26.417725 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.417693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" event={"ID":"5cd2cf42-4b4b-4260-963f-fd7f94555d35","Type":"ContainerStarted","Data":"fc1fe49d79006d8cde57622207bcdb1dc2dc4845e916d8942499f0022476ed8e"} Apr 20 10:02:26.418024 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.418003 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:26.419036 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.419020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dbc9c6698-7l896" Apr 20 10:02:26.442267 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.442226 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7rdz4" podStartSLOduration=33.979909006 podStartE2EDuration="35.442214809s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:02:24.574956344 +0000 UTC m=+70.998017483" lastFinishedPulling="2026-04-20 10:02:26.03726215 +0000 UTC m=+72.460323286" observedRunningTime="2026-04-20 10:02:26.441074532 +0000 UTC m=+72.864135675" watchObservedRunningTime="2026-04-20 10:02:26.442214809 +0000 UTC m=+72.865275952" Apr 20 10:02:26.505677 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:26.505611 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-t7cfx" podStartSLOduration=33.603454539 podStartE2EDuration="35.50559737s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:02:24.138861233 +0000 UTC m=+70.561922358" lastFinishedPulling="2026-04-20 10:02:26.041004064 +0000 UTC m=+72.464065189" observedRunningTime="2026-04-20 10:02:26.503538109 +0000 UTC m=+72.926599252" watchObservedRunningTime="2026-04-20 10:02:26.50559737 +0000 UTC m=+72.928658513" Apr 20 10:02:33.078374 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:33.078258 2577 scope.go:117] "RemoveContainer" containerID="ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0" Apr 20 10:02:33.078769 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:33.078435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-2zgz2_openshift-console-operator(db045f44-d582-4037-82eb-d656372b093e)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podUID="db045f44-d582-4037-82eb-d656372b093e" Apr 20 10:02:33.404030 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:33.404006 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cqrj9" Apr 20 10:02:35.069380 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.069343 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n4j86"] Apr 20 10:02:35.105784 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.105756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.108555 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.108521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 10:02:35.108681 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.108617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gq8bt\"" Apr 20 10:02:35.109769 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.109750 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 10:02:35.109859 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.109831 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 10:02:35.109914 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.109897 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 10:02:35.198351 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198324 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198467 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198467 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-root\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198467 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-textfile\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198467 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-metrics-client-ca\") pod 
\"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198467 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-sys\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198635 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmff4\" (UniqueName: \"kubernetes.io/projected/cb15f011-6dbf-43b9-8367-0979ca21cb28-kube-api-access-gmff4\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198635 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.198635 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.198577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-wtmp\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-root\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-textfile\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-metrics-client-ca\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299312 ip-10-0-140-95 kubenswrapper[2577]: 
I0420 10:02:35.299289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-sys\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmff4\" (UniqueName: \"kubernetes.io/projected/cb15f011-6dbf-43b9-8367-0979ca21cb28-kube-api-access-gmff4\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-root\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-sys\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:35.299423 2577 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:02:35.299478 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls podName:cb15f011-6dbf-43b9-8367-0979ca21cb28 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:35.799458633 +0000 UTC m=+82.222519756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls") pod "node-exporter-n4j86" (UID: "cb15f011-6dbf-43b9-8367-0979ca21cb28") : secret "node-exporter-tls" not found Apr 20 10:02:35.299564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-wtmp\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299926 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-textfile\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299926 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-wtmp\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.299926 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:02:35.299873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.300039 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.299921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb15f011-6dbf-43b9-8367-0979ca21cb28-metrics-client-ca\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.301729 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.301695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.308365 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.308347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmff4\" (UniqueName: \"kubernetes.io/projected/cb15f011-6dbf-43b9-8367-0979ca21cb28-kube-api-access-gmff4\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.803593 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.803561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls\") pod \"node-exporter-n4j86\" (UID: 
\"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:35.805948 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:35.805924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb15f011-6dbf-43b9-8367-0979ca21cb28-node-exporter-tls\") pod \"node-exporter-n4j86\" (UID: \"cb15f011-6dbf-43b9-8367-0979ca21cb28\") " pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:36.015139 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:36.015113 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n4j86" Apr 20 10:02:36.023892 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:36.023869 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb15f011_6dbf_43b9_8367_0979ca21cb28.slice/crio-143f28d2f5747c47f00477e92a5426fdb01fa5d8f2b0c55bd2e793221ebbfecd WatchSource:0}: Error finding container 143f28d2f5747c47f00477e92a5426fdb01fa5d8f2b0c55bd2e793221ebbfecd: Status 404 returned error can't find the container with id 143f28d2f5747c47f00477e92a5426fdb01fa5d8f2b0c55bd2e793221ebbfecd Apr 20 10:02:36.447806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:36.447765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4j86" event={"ID":"cb15f011-6dbf-43b9-8367-0979ca21cb28","Type":"ContainerStarted","Data":"143f28d2f5747c47f00477e92a5426fdb01fa5d8f2b0c55bd2e793221ebbfecd"} Apr 20 10:02:37.451782 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:37.451749 2577 generic.go:358] "Generic (PLEG): container finished" podID="cb15f011-6dbf-43b9-8367-0979ca21cb28" containerID="0856bbea8f0294a89be69ca771446175b765a63a6bb74d43e8a64e24474f4740" exitCode=0 Apr 20 10:02:37.452164 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:37.451816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-n4j86" event={"ID":"cb15f011-6dbf-43b9-8367-0979ca21cb28","Type":"ContainerDied","Data":"0856bbea8f0294a89be69ca771446175b765a63a6bb74d43e8a64e24474f4740"} Apr 20 10:02:38.456509 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:38.456470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4j86" event={"ID":"cb15f011-6dbf-43b9-8367-0979ca21cb28","Type":"ContainerStarted","Data":"798440fa051ef15ecd09bb818e5dc44b5685c6c206e75a1e6269db472af601ad"} Apr 20 10:02:38.456509 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:38.456510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4j86" event={"ID":"cb15f011-6dbf-43b9-8367-0979ca21cb28","Type":"ContainerStarted","Data":"1035e287ee5ffc31676d3c39ae39509cdfc5253ee2e06d4c19aef6513fe63d3e"} Apr 20 10:02:38.479708 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:38.479641 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n4j86" podStartSLOduration=2.633209678 podStartE2EDuration="3.479628849s" podCreationTimestamp="2026-04-20 10:02:35 +0000 UTC" firstStartedPulling="2026-04-20 10:02:36.02548026 +0000 UTC m=+82.448541382" lastFinishedPulling="2026-04-20 10:02:36.871899428 +0000 UTC m=+83.294960553" observedRunningTime="2026-04-20 10:02:38.478104354 +0000 UTC m=+84.901165525" watchObservedRunningTime="2026-04-20 10:02:38.479628849 +0000 UTC m=+84.902689992" Apr 20 10:02:41.290330 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.290296 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:02:41.296710 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.296692 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.299602 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.299582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 10:02:41.299807 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.299785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 10:02:41.299878 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.299788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 10:02:41.299878 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.299824 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6cl30p8e0g44f\"" Apr 20 10:02:41.299878 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.299825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 10:02:41.300231 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300217 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 10:02:41.300274 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 10:02:41.300881 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300860 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 10:02:41.300881 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300870 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 10:02:41.301052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300887 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 10:02:41.301052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300895 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 10:02:41.301052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300965 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 10:02:41.301052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300997 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 10:02:41.301052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.300998 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-w9gpt\"" Apr 20 10:02:41.303153 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.303131 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 10:02:41.306510 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.306489 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:02:41.340037 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340144 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340144 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340144 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340307 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340307 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340307 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340307 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340307 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbcdz\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340365 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340488 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340687 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340687 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.340687 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.340559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441792 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.441979 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.441910 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbcdz\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442148 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442286 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442907 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442907 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442907 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442907 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.442907 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.442801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.446185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.446340 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.447007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.447007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.447007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.447007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.447007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.446957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.449045 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.447598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.449045 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.447922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.451718 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.449279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.451718 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.449442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.452950 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.452192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbcdz\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.453704 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.453684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.453855 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.453836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.454011 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.453963 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.607329 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.607275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:02:41.739683 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:41.739632 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:02:41.743605 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:02:41.743566 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda558aa93_21b2_497e_bbaa_ac2985f6f656.slice/crio-7bbd1400d646689caeb99364966673d9f245dd4f9cfa5c292c9ec09a98ba66b1 WatchSource:0}: Error finding container 7bbd1400d646689caeb99364966673d9f245dd4f9cfa5c292c9ec09a98ba66b1: Status 404 returned error can't find the container with id 7bbd1400d646689caeb99364966673d9f245dd4f9cfa5c292c9ec09a98ba66b1 Apr 20 10:02:42.468725 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:42.468691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"7bbd1400d646689caeb99364966673d9f245dd4f9cfa5c292c9ec09a98ba66b1"} Apr 20 10:02:43.473410 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:43.473372 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" exitCode=0 Apr 20 10:02:43.473861 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:43.473467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} Apr 20 10:02:45.078166 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.078135 2577 scope.go:117] "RemoveContainer" containerID="ed55052e16d220b89b6a1e9c3fc7773cd14d15d248c1b97c88abb870d7cc44a0" Apr 20 10:02:45.484489 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.484449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log" Apr 20 10:02:45.484730 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.484515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" event={"ID":"db045f44-d582-4037-82eb-d656372b093e","Type":"ContainerStarted","Data":"6156d56e5e9da118e7cd928d49432d8ff2d9dbe5c48098a106641a1e1abbc581"} Apr 20 10:02:45.485423 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.485045 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" Apr 20 10:02:45.504237 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.504187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" podStartSLOduration=49.509824914 podStartE2EDuration="54.504173041s" podCreationTimestamp="2026-04-20 10:01:51 +0000 UTC" firstStartedPulling="2026-04-20 10:01:52.22083844 +0000 UTC m=+38.643899582" lastFinishedPulling="2026-04-20 10:01:57.215186578 +0000 UTC m=+43.638247709" observedRunningTime="2026-04-20 10:02:45.502317422 +0000 UTC m=+91.925378572" watchObservedRunningTime="2026-04-20 10:02:45.504173041 +0000 UTC m=+91.927234184" Apr 20 10:02:45.728958 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:45.728922 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2zgz2" Apr 20 10:02:46.489823 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:46.489791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} Apr 20 10:02:47.497676 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:47.497630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} Apr 20 10:02:49.507534 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:49.507504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} Apr 20 10:02:49.507534 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:49.507539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} Apr 20 10:02:49.507954 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:49.507550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} Apr 20 10:02:49.507954 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:49.507559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerStarted","Data":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} Apr 20 10:02:49.539461 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:49.539406 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7594040770000001 podStartE2EDuration="8.539387998s" podCreationTimestamp="2026-04-20 10:02:41 +0000 UTC" firstStartedPulling="2026-04-20 10:02:41.745585841 +0000 UTC m=+88.168646962" lastFinishedPulling="2026-04-20 10:02:48.525569759 +0000 UTC m=+94.948630883" observedRunningTime="2026-04-20 10:02:49.536559448 +0000 UTC m=+95.959620591" watchObservedRunningTime="2026-04-20 10:02:49.539387998 +0000 UTC m=+95.962449142" Apr 20 10:02:51.607862 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:02:51.607824 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:03:03.550478 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:03.550443 2577 generic.go:358] "Generic (PLEG): container finished" podID="81c6d571-c228-4b53-8d5e-c96359b3d8f6" containerID="15a46ca0dbfee901d60bed390d5aee7d63535e793baf872b34188aebf3181b90" exitCode=0 Apr 20 10:03:03.550903 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:03.550525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" event={"ID":"81c6d571-c228-4b53-8d5e-c96359b3d8f6","Type":"ContainerDied","Data":"15a46ca0dbfee901d60bed390d5aee7d63535e793baf872b34188aebf3181b90"} Apr 20 10:03:03.550903 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:03.550845 2577 scope.go:117] "RemoveContainer" containerID="15a46ca0dbfee901d60bed390d5aee7d63535e793baf872b34188aebf3181b90" Apr 20 10:03:04.554772 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:04.554733 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="9cedfb03-6b25-46db-a934-2933c2d42473" containerID="4172cbda1740a915afb9a5341737b6bf261360f5ed99ef70ecfba4be80130fe7" exitCode=0 Apr 20 10:03:04.555195 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:04.554808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5fn5" event={"ID":"9cedfb03-6b25-46db-a934-2933c2d42473","Type":"ContainerDied","Data":"4172cbda1740a915afb9a5341737b6bf261360f5ed99ef70ecfba4be80130fe7"} Apr 20 10:03:04.555195 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:04.555151 2577 scope.go:117] "RemoveContainer" containerID="4172cbda1740a915afb9a5341737b6bf261360f5ed99ef70ecfba4be80130fe7" Apr 20 10:03:04.556694 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:04.556559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-79qp4" event={"ID":"81c6d571-c228-4b53-8d5e-c96359b3d8f6","Type":"ContainerStarted","Data":"2be662702f18a5f63157213708f536072c32d8ecb1c0c5f90259958b4c15cc24"} Apr 20 10:03:05.283598 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:05.283568 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dbc9c6698-7l896_adc46263-5b99-4162-8415-8a084543bdad/router/0.log" Apr 20 10:03:05.288492 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:05.288468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5fzkf_84d8b916-498e-4189-840c-c6931e4b0d70/serve-healthcheck-canary/0.log" Apr 20 10:03:05.563821 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:05.563727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5fn5" event={"ID":"9cedfb03-6b25-46db-a934-2933c2d42473","Type":"ContainerStarted","Data":"553466d610c3cc24d49d820cc789973956f69504037823b63a236f1072e77613"} Apr 20 10:03:13.589077 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:13.589035 2577 generic.go:358] 
"Generic (PLEG): container finished" podID="289379f5-7b90-499a-a7cd-14690b1bb4b1" containerID="4ed22101c1a8516f350a2206bd7f405166fa65e4e68c269a878b88fec7116caa" exitCode=0 Apr 20 10:03:13.589493 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:13.589113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" event={"ID":"289379f5-7b90-499a-a7cd-14690b1bb4b1","Type":"ContainerDied","Data":"4ed22101c1a8516f350a2206bd7f405166fa65e4e68c269a878b88fec7116caa"} Apr 20 10:03:13.589493 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:13.589455 2577 scope.go:117] "RemoveContainer" containerID="4ed22101c1a8516f350a2206bd7f405166fa65e4e68c269a878b88fec7116caa" Apr 20 10:03:14.593726 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:14.593688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dk6lf" event={"ID":"289379f5-7b90-499a-a7cd-14690b1bb4b1","Type":"ContainerStarted","Data":"d3ae8f239af9b2d0d42bf395e64ef2a183ca559b9dfcc0acce63718738d01696"} Apr 20 10:03:41.607760 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:41.607721 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:03:41.623047 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:41.623022 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:03:41.694261 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:41.694230 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:03:59.754939 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.754906 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:03:59.755441 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:03:59.755356 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="prometheus" containerID="cri-o://2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" gracePeriod=600 Apr 20 10:03:59.755441 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.755379 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy" containerID="cri-o://fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" gracePeriod=600 Apr 20 10:03:59.755441 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.755413 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-thanos" containerID="cri-o://634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" gracePeriod=600 Apr 20 10:03:59.755618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.755444 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="thanos-sidecar" containerID="cri-o://c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" gracePeriod=600 Apr 20 10:03:59.755618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.755453 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="config-reloader" containerID="cri-o://db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" gracePeriod=600 Apr 20 10:03:59.755618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.755501 2577 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-web" containerID="cri-o://39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" gracePeriod=600 Apr 20 10:03:59.990599 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:03:59.990577 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.114268 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114246 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbcdz\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114284 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114303 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114326 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: 
\"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114352 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114386 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114690 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114415 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114690 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114454 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114690 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114489 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: 
\"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.114898 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114857 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:00.115030 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.114982 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:00.115516 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115250 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115516 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115348 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115516 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115376 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115516 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115405 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115548 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 
20 10:04:00.115806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115599 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115695 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.115806 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115751 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a558aa93-21b2-497e-bbaa-ac2985f6f656\" (UID: \"a558aa93-21b2-497e-bbaa-ac2985f6f656\") " Apr 20 10:04:00.116040 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115981 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\"" Apr 20 10:04:00.116040 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.115997 2577 reconciler_common.go:299] 
"Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.116760 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.116562 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:04:00.117044 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.116622 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:04:00.117248 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.117219 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "prometheus-trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:04:00.117332 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.117296 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz" (OuterVolumeSpecName: "kube-api-access-xbcdz") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "kube-api-access-xbcdz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:04:00.117394 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.117364 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.117542 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.117506 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:04:00.117744 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.117706 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "tls-assets".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:04:00.118432 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.118404 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.119052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.119023 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config" (OuterVolumeSpecName: "config") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.119545 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.119520 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.120001 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.119976 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-prometheus-k8s-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.120101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.120005 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.120101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.120073 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out" (OuterVolumeSpecName: "config-out") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:04:00.120185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.120157 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.120185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.120168 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "secret-metrics-client-certs".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.129786 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.129763 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config" (OuterVolumeSpecName: "web-config") pod "a558aa93-21b2-497e-bbaa-ac2985f6f656" (UID: "a558aa93-21b2-497e-bbaa-ac2985f6f656"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:04:00.216906 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216884 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-tls-assets\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.216906 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216905 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-kube-rbac-proxy\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216915 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216926 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216936 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName:
\"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-db\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216944 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216954 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-configmap-metrics-client-ca\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216963 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-config\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216972 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216981 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-metrics-client-certs\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216989 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName:
\"kubernetes.io/empty-dir/a558aa93-21b2-497e-bbaa-ac2985f6f656-config-out\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.216998 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-web-config\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.217007 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.217015 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbcdz\" (UniqueName: \"kubernetes.io/projected/a558aa93-21b2-497e-bbaa-ac2985f6f656-kube-api-access-xbcdz\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217019 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.217024 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a558aa93-21b2-497e-bbaa-ac2985f6f656-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.217372 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.217033 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558aa93-21b2-497e-bbaa-ac2985f6f656-secret-grpc-tls\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.737972 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656"
containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" exitCode=0
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.737999 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" exitCode=0
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738005 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" exitCode=0
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738012 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" exitCode=0
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738018 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" exitCode=0
Apr 20 10:04:00.738059 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738023 2577 generic.go:358] "Generic (PLEG): container finished" podID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" exitCode=0
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738087 2577 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0"
event={"ID":"a558aa93-21b2-497e-bbaa-ac2985f6f656","Type":"ContainerDied","Data":"7bbd1400d646689caeb99364966673d9f245dd4f9cfa5c292c9ec09a98ba66b1"}
Apr 20 10:04:00.738409 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.738180 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"
Apr 20 10:04:00.748941 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.748876 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"
Apr 20 10:04:00.756670 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.756478 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"
Apr 20 10:04:00.762830 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.762811 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"
Apr 20 10:04:00.768648 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.768627 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 10:04:00.774767 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.774743 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 10:04:00.783433 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.783417 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"
Apr 20 10:04:00.789671 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.789641 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"
Apr 20 10:04:00.796229 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.796213 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"
Apr 20 10:04:00.802199 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802184 2577 scope.go:117]
"RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"
Apr 20 10:04:00.802444 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.802422 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"
Apr 20 10:04:00.802512 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802454 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist"
Apr 20 10:04:00.802512 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802497 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"
Apr 20 10:04:00.802728 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.802709 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"
Apr 20 10:04:00.802764 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802734 2577 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist"
Apr 20 10:04:00.802764 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802757 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"
Apr 20 10:04:00.802989 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.802971 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"
Apr 20 10:04:00.803032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.802994 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist"
Apr 20 10:04:00.803032 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803009 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"
Apr 20
10:04:00.803205 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.803189 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"
Apr 20 10:04:00.803263 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803212 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist"
Apr 20 10:04:00.803263 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803231 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"
Apr 20 10:04:00.803446 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.803429 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"
Apr 20 10:04:00.803483 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803450 2577 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist"
Apr 20 10:04:00.803483 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803467 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"
Apr 20 10:04:00.803630 ip-10-0-140-95 kubenswrapper[2577]: E0420 10:04:00.803614 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"
Apr 20 10:04:00.803700 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803633 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist"
Apr 20 10:04:00.803700 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803646 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"
Apr 20 10:04:00.803945 ip-10-0-140-95
kubenswrapper[2577]: E0420 10:04:00.803924 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"
Apr 20 10:04:00.803995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803947 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist"
Apr 20 10:04:00.803995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.803961 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"
Apr 20 10:04:00.804211 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804192 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist"
Apr 20 10:04:00.804255 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804212 2577 scope.go:117] "RemoveContainer"
containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"
Apr 20 10:04:00.804426 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804406 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist"
Apr 20 10:04:00.804503 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804427 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"
Apr 20 10:04:00.804633 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804617 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist"
Apr 20 10:04:00.804710 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804633 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"
Apr 20 10:04:00.804844 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804828 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status
\"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist"
Apr 20 10:04:00.804888 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.804844 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"
Apr 20 10:04:00.805042 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805025 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist"
Apr 20 10:04:00.805088 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805044 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"
Apr 20 10:04:00.805207 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805192 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist"
Apr 20 10:04:00.805249 ip-10-0-140-95
kubenswrapper[2577]: I0420 10:04:00.805206 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" Apr 20 10:04:00.805396 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805379 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" Apr 20 10:04:00.805489 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805397 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" Apr 20 10:04:00.805565 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805550 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist" Apr 20 10:04:00.805605 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805565 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" Apr 20 10:04:00.805763 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805744 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist" Apr 20 10:04:00.805809 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805764 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" Apr 20 10:04:00.805916 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805901 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist" Apr 20 10:04:00.805956 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.805915 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" Apr 20 10:04:00.806077 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806061 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with 
c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist" Apr 20 10:04:00.806116 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806077 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" Apr 20 10:04:00.806231 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806215 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist" Apr 20 10:04:00.806272 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806231 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" Apr 20 10:04:00.806433 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806415 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist" Apr 20 10:04:00.806486 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806433 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" Apr 20 10:04:00.806604 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806588 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" Apr 20 10:04:00.806645 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806605 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" Apr 20 10:04:00.806807 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806791 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist" Apr 20 10:04:00.806845 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.806808 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" Apr 20 10:04:00.807020 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807007 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container 
\"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist" Apr 20 10:04:00.807056 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807021 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" Apr 20 10:04:00.807192 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807176 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist" Apr 20 10:04:00.807230 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807193 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" Apr 20 10:04:00.807362 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807349 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist" Apr 20 10:04:00.807405 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807363 2577 scope.go:117] "RemoveContainer" 
containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" Apr 20 10:04:00.807521 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807506 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist" Apr 20 10:04:00.807573 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807526 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" Apr 20 10:04:00.807727 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807713 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist" Apr 20 10:04:00.807776 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807728 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" Apr 20 10:04:00.807916 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807897 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status 
\"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" Apr 20 10:04:00.807916 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.807914 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" Apr 20 10:04:00.808104 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808079 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist" Apr 20 10:04:00.808104 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808103 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" Apr 20 10:04:00.808376 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808346 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist" Apr 20 10:04:00.808376 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:04:00.808369 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" Apr 20 10:04:00.808639 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808617 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist" Apr 20 10:04:00.808745 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808641 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" Apr 20 10:04:00.809052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808912 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist" Apr 20 10:04:00.809052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.808937 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" Apr 20 10:04:00.809290 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809262 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist" Apr 20 10:04:00.809290 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809287 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" Apr 20 10:04:00.809518 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809499 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist" Apr 20 10:04:00.809581 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809520 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" Apr 20 10:04:00.809871 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809851 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 
82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" Apr 20 10:04:00.809954 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809873 2577 scope.go:117] "RemoveContainer" containerID="634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc" Apr 20 10:04:00.810008 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.809949 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:04:00.810127 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810105 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc"} err="failed to get container status \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": rpc error: code = NotFound desc = could not find container \"634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc\": container with ID starting with 634ac5ea27fe76c9d5cb05a96485acc7eadb6e7ba7178282ee9291b6292d15fc not found: ID does not exist" Apr 20 10:04:00.810178 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810129 2577 scope.go:117] "RemoveContainer" containerID="fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef" Apr 20 10:04:00.810312 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810299 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="init-config-reloader" Apr 20 10:04:00.810369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810315 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="init-config-reloader" Apr 20 10:04:00.810369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810328 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-web" Apr 20 10:04:00.810369 
ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810335 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-web" Apr 20 10:04:00.810369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810346 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="config-reloader" Apr 20 10:04:00.810369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810354 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="config-reloader" Apr 20 10:04:00.810369 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810367 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810373 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810370 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef"} err="failed to get container status \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": rpc error: code = NotFound desc = could not find container \"fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef\": container with ID starting with fb22aabce670652681ccfe80079a5b655f0249088a2cebe498263ae346ba46ef not found: ID does not exist" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810381 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-thanos" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: 
I0420 10:04:00.810387 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-thanos" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810399 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="prometheus" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810407 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="prometheus" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810415 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="thanos-sidecar" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810420 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="thanos-sidecar" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810386 2577 scope.go:117] "RemoveContainer" containerID="39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810471 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="prometheus" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810484 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-web" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810492 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy-thanos" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810500 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="config-reloader" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810510 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="kube-rbac-proxy" Apr 20 10:04:00.810541 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810516 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" containerName="thanos-sidecar" Apr 20 10:04:00.811057 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810702 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402"} err="failed to get container status \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": rpc error: code = NotFound desc = could not find container \"39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402\": container with ID starting with 39a08ca2845d22f1eca4322eaa7effeae658b788cd3e394e46f543b410259402 not found: ID does not exist" Apr 20 10:04:00.811057 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810719 2577 scope.go:117] "RemoveContainer" containerID="c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e" Apr 20 10:04:00.811057 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810917 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e"} err="failed to get container status \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": rpc error: code = NotFound desc = could not find container \"c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e\": container with ID starting with c69a0f5202dd84e49522c05557b841ee1d61e3e9bdc796ff4e6952e4c0c6c60e not found: ID does not exist" Apr 
20 10:04:00.811057 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.810931 2577 scope.go:117] "RemoveContainer" containerID="db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0" Apr 20 10:04:00.811194 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.811083 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0"} err="failed to get container status \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": rpc error: code = NotFound desc = could not find container \"db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0\": container with ID starting with db12c158d0accd07628c94becf49cfeb99918b5981f4397e5fcf32766860bce0 not found: ID does not exist" Apr 20 10:04:00.811194 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.811105 2577 scope.go:117] "RemoveContainer" containerID="2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda" Apr 20 10:04:00.811317 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.811299 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda"} err="failed to get container status \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": rpc error: code = NotFound desc = could not find container \"2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda\": container with ID starting with 2efce115fa9f67e8b56e18c01acd6d316f4460bba57feec3397f4713ce8a5eda not found: ID does not exist" Apr 20 10:04:00.811360 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.811319 2577 scope.go:117] "RemoveContainer" containerID="82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d" Apr 20 10:04:00.811535 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.811518 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d"} err="failed to get container status \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": rpc error: code = NotFound desc = could not find container \"82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d\": container with ID starting with 82968b5ad9307f9ad0be07fdb586e3d9eb74f7860defa80d4e6e58b66fdd3f8d not found: ID does not exist" Apr 20 10:04:00.815846 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.815832 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.818984 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.818967 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-w9gpt\"" Apr 20 10:04:00.819134 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819118 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 10:04:00.819257 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 10:04:00.819311 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819282 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 10:04:00.819356 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819312 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 10:04:00.819720 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 10:04:00.819804 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:04:00.819705 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 10:04:00.819804 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819790 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 10:04:00.819892 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.819806 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6cl30p8e0g44f\"" Apr 20 10:04:00.820340 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.820323 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 10:04:00.820466 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.820350 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 10:04:00.820533 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.820522 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 10:04:00.822474 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.822400 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 10:04:00.824334 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.824318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 10:04:00.826703 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.826688 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 10:04:00.832862 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:04:00.832836 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:04:00.922533 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-web-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtcz\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-kube-api-access-dxtcz\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-config-out\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922618 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922815 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922815 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922815 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922906 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:04:00.922827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922906 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922906 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922887 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.922995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.922993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.923101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.923011 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:00.923101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:00.923031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024141 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024094 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024141 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024253 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024253 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024253 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024253 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024430 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-web-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxtcz\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-kube-api-access-dxtcz\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.024652 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.024593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-config-out\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.025833 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.025410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.028534 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.028209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.028534 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.028317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.028723 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.028567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.028807 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.028787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-web-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.032106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.029169 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.032106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.029827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.032106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.030686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.032106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.030894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.032106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.031002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/127e7995-5b41-480d-af98-90e6ff792104-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.034917 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.034890 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-config\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.035007 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.034940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.035074 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.035008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.035162 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.035138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/127e7995-5b41-480d-af98-90e6ff792104-config-out\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.035346 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.035321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/127e7995-5b41-480d-af98-90e6ff792104-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.037640 ip-10-0-140-95 kubenswrapper[2577]: 
I0420 10:04:01.037614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxtcz\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-kube-api-access-dxtcz\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.037828 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.037812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/127e7995-5b41-480d-af98-90e6ff792104-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"127e7995-5b41-480d-af98-90e6ff792104\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.125162 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.125124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:04:01.258234 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.258208 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 10:04:01.259431 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:04:01.259405 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127e7995_5b41_480d_af98_90e6ff792104.slice/crio-63d94fbb1f4daba2a55adb7a95bf55aefcef536f47552b8d0535c66c0f404186 WatchSource:0}: Error finding container 63d94fbb1f4daba2a55adb7a95bf55aefcef536f47552b8d0535c66c0f404186: Status 404 returned error can't find the container with id 63d94fbb1f4daba2a55adb7a95bf55aefcef536f47552b8d0535c66c0f404186 Apr 20 10:04:01.744165 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.744129 2577 generic.go:358] "Generic (PLEG): container finished" podID="127e7995-5b41-480d-af98-90e6ff792104" containerID="b539c410ba572144ee799e978e9b7efbdc9e2421e10d10d02ade6d966dc7d47b" exitCode=0 Apr 20 10:04:01.744300 ip-10-0-140-95 kubenswrapper[2577]: I0420 
10:04:01.744231 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerDied","Data":"b539c410ba572144ee799e978e9b7efbdc9e2421e10d10d02ade6d966dc7d47b"} Apr 20 10:04:01.744300 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:01.744277 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"63d94fbb1f4daba2a55adb7a95bf55aefcef536f47552b8d0535c66c0f404186"} Apr 20 10:04:02.082728 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.082705 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a558aa93-21b2-497e-bbaa-ac2985f6f656" path="/var/lib/kubelet/pods/a558aa93-21b2-497e-bbaa-ac2985f6f656/volumes" Apr 20 10:04:02.751727 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"81f9837ed736dd434e74a9a8189ff44cf4b80d80398cdb1d52d31114c6ff14b0"} Apr 20 10:04:02.751895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"e8030560dab5bfdd5aed446f2d86607ccf3aeb6d9d3a4814c44b490be84623b4"} Apr 20 10:04:02.751895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"1c9f9c96a495954aac05777a61aab51f6b8b9ea9be268b8f2ec834cb27e4ec91"} Apr 20 10:04:02.751895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"656dbfc0524b8070ea2b79863e6606a40edb086b0caaf3905f25d4454304d20d"} Apr 20 10:04:02.751895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"84f693b08d832456ce6c1ee67ec43616fdbb79270a4702f40d703edc3e648fa5"} Apr 20 10:04:02.751895 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.751796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"127e7995-5b41-480d-af98-90e6ff792104","Type":"ContainerStarted","Data":"7ba05ff6bdca10d3d80a3db8bae8de846b5c911c9cafb29fe365ad9fed2fbe52"} Apr 20 10:04:02.787158 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:02.787111 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.787094588 podStartE2EDuration="2.787094588s" podCreationTimestamp="2026-04-20 10:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:04:02.784237442 +0000 UTC m=+169.207298587" watchObservedRunningTime="2026-04-20 10:04:02.787094588 +0000 UTC m=+169.210155731" Apr 20 10:04:06.126021 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:04:06.125982 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:05:01.125442 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:05:01.125408 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:05:01.141330 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:05:01.141307 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:05:01.949689 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:05:01.949644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 10:06:14.036682 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:06:14.036636 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log" Apr 20 10:06:14.037265 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:06:14.036739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log" Apr 20 10:06:14.041331 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:06:14.041296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log" Apr 20 10:06:14.041519 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:06:14.041498 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log" Apr 20 10:06:14.046164 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:06:14.046147 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 10:07:18.999148 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:18.999049 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"] Apr 20 10:07:19.002276 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.002255 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" Apr 20 10:07:19.004781 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.004749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 20 10:07:19.004902 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.004838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 20 10:07:19.005943 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.005924 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-vk4xt\"" Apr 20 10:07:19.013886 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.013863 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"] Apr 20 10:07:19.130343 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.130316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5017922b-dc80-478e-aec0-286a2d621d81-tmp\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" Apr 20 10:07:19.130451 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.130386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz4hn\" (UniqueName: \"kubernetes.io/projected/5017922b-dc80-478e-aec0-286a2d621d81-kube-api-access-xz4hn\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" Apr 20 10:07:19.230822 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.230796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5017922b-dc80-478e-aec0-286a2d621d81-tmp\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"
Apr 20 10:07:19.230942 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.230853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz4hn\" (UniqueName: \"kubernetes.io/projected/5017922b-dc80-478e-aec0-286a2d621d81-kube-api-access-xz4hn\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"
Apr 20 10:07:19.231212 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.231187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5017922b-dc80-478e-aec0-286a2d621d81-tmp\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"
Apr 20 10:07:19.241450 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.241422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz4hn\" (UniqueName: \"kubernetes.io/projected/5017922b-dc80-478e-aec0-286a2d621d81-kube-api-access-xz4hn\") pod \"jobset-operator-747c5859c7-zsmm4\" (UID: \"5017922b-dc80-478e-aec0-286a2d621d81\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"
Apr 20 10:07:19.330742 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.330692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"
Apr 20 10:07:19.451470 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.451386 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4"]
Apr 20 10:07:19.454132 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:07:19.454105 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5017922b_dc80_478e_aec0_286a2d621d81.slice/crio-64af992054839cf0b14b72f78d0bc28acc095b488d070251caba96a93d71af08 WatchSource:0}: Error finding container 64af992054839cf0b14b72f78d0bc28acc095b488d070251caba96a93d71af08: Status 404 returned error can't find the container with id 64af992054839cf0b14b72f78d0bc28acc095b488d070251caba96a93d71af08
Apr 20 10:07:19.455403 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:19.455386 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 10:07:20.325015 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:20.324974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" event={"ID":"5017922b-dc80-478e-aec0-286a2d621d81","Type":"ContainerStarted","Data":"64af992054839cf0b14b72f78d0bc28acc095b488d070251caba96a93d71af08"}
Apr 20 10:07:28.350980 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:28.350944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" event={"ID":"5017922b-dc80-478e-aec0-286a2d621d81","Type":"ContainerStarted","Data":"8e3bfc08c6ed9f93d0d060fdec365e903dd5c58b500f854f2e382f2cfbe657f9"}
Apr 20 10:07:28.370697 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:07:28.370638 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-zsmm4" podStartSLOduration=2.017429478 podStartE2EDuration="10.37062589s" podCreationTimestamp="2026-04-20 10:07:18 +0000 UTC" firstStartedPulling="2026-04-20 10:07:19.455544241 +0000 UTC m=+365.878605365" lastFinishedPulling="2026-04-20 10:07:27.808740641 +0000 UTC m=+374.231801777" observedRunningTime="2026-04-20 10:07:28.368615681 +0000 UTC m=+374.791676825" watchObservedRunningTime="2026-04-20 10:07:28.37062589 +0000 UTC m=+374.793687033"
Apr 20 10:11:14.061048 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:11:14.061017 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:11:14.062274 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:11:14.062250 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:11:14.065236 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:11:14.065213 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:11:14.066120 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:11:14.066093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:16:14.087972 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:16:14.087874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:16:14.089884 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:16:14.089847 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:16:14.092373 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:16:14.092350 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:16:14.094202 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:16:14.094180 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:19:54.435212 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.435184 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"]
Apr 20 10:19:54.444328 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.444303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:19:54.447477 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.447442 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6brcw\"/\"kube-root-ca.crt\""
Apr 20 10:19:54.447906 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.447882 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"]
Apr 20 10:19:54.448822 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.448806 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-6brcw\"/\"default-dockercfg-4dtrh\""
Apr 20 10:19:54.448912 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.448838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6brcw\"/\"openshift-service-ca.crt\""
Apr 20 10:19:54.514673 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.514626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlcn\" (UniqueName: \"kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn\") pod \"test-trainjob-79p7n-node-0-0-lg8p9\" (UID: \"78efac0d-d03c-481b-a071-8d472bb66543\") " pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:19:54.615384 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.615353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlcn\" (UniqueName: \"kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn\") pod \"test-trainjob-79p7n-node-0-0-lg8p9\" (UID: \"78efac0d-d03c-481b-a071-8d472bb66543\") " pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:19:54.624257 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.624231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlcn\" (UniqueName: \"kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn\") pod \"test-trainjob-79p7n-node-0-0-lg8p9\" (UID: \"78efac0d-d03c-481b-a071-8d472bb66543\") " pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:19:54.754632 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.754571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:19:54.879070 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.879042 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"]
Apr 20 10:19:54.881390 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:19:54.881364 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78efac0d_d03c_481b_a071_8d472bb66543.slice/crio-4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe WatchSource:0}: Error finding container 4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe: Status 404 returned error can't find the container with id 4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe
Apr 20 10:19:54.883354 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:54.883338 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 10:19:55.624869 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:19:55.624817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9" event={"ID":"78efac0d-d03c-481b-a071-8d472bb66543","Type":"ContainerStarted","Data":"4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe"}
Apr 20 10:21:14.109557 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:21:14.109527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:21:14.112852 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:21:14.112826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:21:14.113251 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:21:14.113234 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:21:14.116984 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:21:14.116969 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:26:23.759185 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:23.759156 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:26:23.800762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:23.759217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:26:23.800762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:23.772681 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:26:23.800762 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:23.772695 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:26:25.802299 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:25.802257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9" event={"ID":"78efac0d-d03c-481b-a071-8d472bb66543","Type":"ContainerStarted","Data":"173d3abefca165eb17c7de3f9f999459d15f7bfcdda000429d0a9214e12886c0"}
Apr 20 10:26:25.805251 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:25.805236 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-6brcw\"/\"default-dockercfg-4dtrh\""
Apr 20 10:26:25.831141 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:25.831085 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9" podStartSLOduration=1.7804846250000002 podStartE2EDuration="6m31.831068715s" podCreationTimestamp="2026-04-20 10:19:54 +0000 UTC" firstStartedPulling="2026-04-20 10:19:54.883484683 +0000 UTC m=+1121.306545804" lastFinishedPulling="2026-04-20 10:26:24.934068766 +0000 UTC m=+1511.357129894" observedRunningTime="2026-04-20 10:26:25.8290148 +0000 UTC m=+1512.252075944" watchObservedRunningTime="2026-04-20 10:26:25.831068715 +0000 UTC m=+1512.254129861"
Apr 20 10:26:25.907983 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:25.907956 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6brcw\"/\"kube-root-ca.crt\""
Apr 20 10:26:25.917890 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:25.917863 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-6brcw\"/\"openshift-service-ca.crt\""
Apr 20 10:26:28.813287 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:28.813245 2577 generic.go:358] "Generic (PLEG): container finished" podID="78efac0d-d03c-481b-a071-8d472bb66543" containerID="173d3abefca165eb17c7de3f9f999459d15f7bfcdda000429d0a9214e12886c0" exitCode=0
Apr 20 10:26:28.813690 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:28.813324 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9" event={"ID":"78efac0d-d03c-481b-a071-8d472bb66543","Type":"ContainerDied","Data":"173d3abefca165eb17c7de3f9f999459d15f7bfcdda000429d0a9214e12886c0"}
Apr 20 10:26:29.948080 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:29.948055 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:26:30.037594 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.037573 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlcn\" (UniqueName: \"kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn\") pod \"78efac0d-d03c-481b-a071-8d472bb66543\" (UID: \"78efac0d-d03c-481b-a071-8d472bb66543\") "
Apr 20 10:26:30.039874 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.039848 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn" (OuterVolumeSpecName: "kube-api-access-vnlcn") pod "78efac0d-d03c-481b-a071-8d472bb66543" (UID: "78efac0d-d03c-481b-a071-8d472bb66543"). InnerVolumeSpecName "kube-api-access-vnlcn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:26:30.138527 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.138503 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnlcn\" (UniqueName: \"kubernetes.io/projected/78efac0d-d03c-481b-a071-8d472bb66543-kube-api-access-vnlcn\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:26:30.820561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.820491 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"
Apr 20 10:26:30.820561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.820505 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9" event={"ID":"78efac0d-d03c-481b-a071-8d472bb66543","Type":"ContainerDied","Data":"4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe"}
Apr 20 10:26:30.820561 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:26:30.820536 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4139370cc15d3cb4392953d1418e2a8867db96338df19abf71723b2be8d2cfbe"
Apr 20 10:31:23.792067 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:31:23.791934 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:31:23.793537 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:31:23.793515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:31:23.796074 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:31:23.796043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:31:23.797690 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:31:23.797647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:36:23.817106 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:23.816951 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:36:23.821325 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:23.818506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:36:23.821524 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:23.821505 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:36:23.822702 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:23.822677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"
Apr 20 10:36:29.475474 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:29.475438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-6brcw_test-trainjob-79p7n-node-0-0-lg8p9_78efac0d-d03c-481b-a071-8d472bb66543/node/0.log"
Apr 20 10:36:30.939613 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.939582 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vddxz/must-gather-xf4g8"]
Apr 20 10:36:30.940101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.940031 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78efac0d-d03c-481b-a071-8d472bb66543" containerName="node"
Apr 20 10:36:30.940101 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.940049 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="78efac0d-d03c-481b-a071-8d472bb66543" containerName="node"
Apr 20 10:36:30.940222 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.940171 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="78efac0d-d03c-481b-a071-8d472bb66543" containerName="node"
Apr 20 10:36:30.943053 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.943032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:30.945742 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.945720 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vddxz\"/\"kube-root-ca.crt\""
Apr 20 10:36:30.946822 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.946803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vddxz\"/\"default-dockercfg-skfpc\""
Apr 20 10:36:30.946897 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.946809 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vddxz\"/\"openshift-service-ca.crt\""
Apr 20 10:36:30.950743 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:30.950723 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vddxz/must-gather-xf4g8"]
Apr 20 10:36:31.120733 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.120705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrkp\" (UniqueName: \"kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.120848 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.120749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.221207 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.221154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrkp\" (UniqueName: \"kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.221207 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.221189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.221445 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.221431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.231098 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.231076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrkp\" (UniqueName: \"kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp\") pod \"must-gather-xf4g8\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") " pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.252247 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.252229 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:36:31.371557 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.371522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vddxz/must-gather-xf4g8"]
Apr 20 10:36:31.374852 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:36:31.374823 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d5b9e1e_7b1f_44c2_b419_092091ad19de.slice/crio-66aab5a9ed95f3d3326d09bedac0b2dd4536dbe357f43f4228faf7c553f54d04 WatchSource:0}: Error finding container 66aab5a9ed95f3d3326d09bedac0b2dd4536dbe357f43f4228faf7c553f54d04: Status 404 returned error can't find the container with id 66aab5a9ed95f3d3326d09bedac0b2dd4536dbe357f43f4228faf7c553f54d04
Apr 20 10:36:31.376366 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.376349 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 10:36:31.664714 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:31.664680 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vddxz/must-gather-xf4g8" event={"ID":"9d5b9e1e-7b1f-44c2-b419-092091ad19de","Type":"ContainerStarted","Data":"66aab5a9ed95f3d3326d09bedac0b2dd4536dbe357f43f4228faf7c553f54d04"}
Apr 20 10:36:34.507622 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:34.507591 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"]
Apr 20 10:36:34.511345 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:34.511319 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-6brcw/test-trainjob-79p7n-node-0-0-lg8p9"]
Apr 20 10:36:36.082210 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:36.082115 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78efac0d-d03c-481b-a071-8d472bb66543" path="/var/lib/kubelet/pods/78efac0d-d03c-481b-a071-8d472bb66543/volumes"
Apr 20 10:36:36.682539 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:36.682495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vddxz/must-gather-xf4g8" event={"ID":"9d5b9e1e-7b1f-44c2-b419-092091ad19de","Type":"ContainerStarted","Data":"94f2ffad5701682f25fb99cbdcad07f98cd02a3a18dd5f3945b1afddd81b2c5f"}
Apr 20 10:36:36.682539 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:36.682536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vddxz/must-gather-xf4g8" event={"ID":"9d5b9e1e-7b1f-44c2-b419-092091ad19de","Type":"ContainerStarted","Data":"f7621dc5fa3ca60f423b7e8a8b01ead6e3b226050e42b5df430d12c92ca68ebe"}
Apr 20 10:36:36.698470 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:36:36.698430 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vddxz/must-gather-xf4g8" podStartSLOduration=2.357608145 podStartE2EDuration="6.698417565s" podCreationTimestamp="2026-04-20 10:36:30 +0000 UTC" firstStartedPulling="2026-04-20 10:36:31.376473533 +0000 UTC m=+2117.799534654" lastFinishedPulling="2026-04-20 10:36:35.717282941 +0000 UTC m=+2122.140344074" observedRunningTime="2026-04-20 10:36:36.697567325 +0000 UTC m=+2123.120628470" watchObservedRunningTime="2026-04-20 10:36:36.698417565 +0000 UTC m=+2123.121478708"
Apr 20 10:37:14.187583 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:14.187509 2577 scope.go:117] "RemoveContainer" containerID="173d3abefca165eb17c7de3f9f999459d15f7bfcdda000429d0a9214e12886c0"
Apr 20 10:37:19.830035 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:19.830004 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerID="f7621dc5fa3ca60f423b7e8a8b01ead6e3b226050e42b5df430d12c92ca68ebe" exitCode=0
Apr 20 10:37:19.830435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:19.830053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vddxz/must-gather-xf4g8" event={"ID":"9d5b9e1e-7b1f-44c2-b419-092091ad19de","Type":"ContainerDied","Data":"f7621dc5fa3ca60f423b7e8a8b01ead6e3b226050e42b5df430d12c92ca68ebe"}
Apr 20 10:37:19.830435 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:19.830319 2577 scope.go:117] "RemoveContainer" containerID="f7621dc5fa3ca60f423b7e8a8b01ead6e3b226050e42b5df430d12c92ca68ebe"
Apr 20 10:37:20.324535 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:20.324505 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vddxz_must-gather-xf4g8_9d5b9e1e-7b1f-44c2-b419-092091ad19de/gather/0.log"
Apr 20 10:37:24.113514 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:24.113483 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5r7zs_9968b560-1fb0-4930-8c96-a8878efe7d90/global-pull-secret-syncer/0.log"
Apr 20 10:37:24.378065 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:24.377990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j4g46_ff1633f6-fc91-4bfb-955c-d10341913ddc/konnectivity-agent/0.log"
Apr 20 10:37:24.461112 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:24.461089 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-95.ec2.internal_36c01a4c8423a95e1f1d0a7a3f9614c6/haproxy/0.log"
Apr 20 10:37:25.724523 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.724479 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vddxz/must-gather-xf4g8"]
Apr 20 10:37:25.725047 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.724743 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vddxz/must-gather-xf4g8" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="copy" containerID="cri-o://94f2ffad5701682f25fb99cbdcad07f98cd02a3a18dd5f3945b1afddd81b2c5f" gracePeriod=2
Apr 20 10:37:25.727809 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.727774 2577 status_manager.go:895] "Failed to get status for pod" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" pod="openshift-must-gather-vddxz/must-gather-xf4g8" err="pods \"must-gather-xf4g8\" is forbidden: User \"system:node:ip-10-0-140-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vddxz\": no relationship found between node 'ip-10-0-140-95.ec2.internal' and this object"
Apr 20 10:37:25.727967 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.727817 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vddxz/must-gather-xf4g8"]
Apr 20 10:37:25.857068 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.857043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vddxz_must-gather-xf4g8_9d5b9e1e-7b1f-44c2-b419-092091ad19de/copy/0.log"
Apr 20 10:37:25.857429 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.857408 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerID="94f2ffad5701682f25fb99cbdcad07f98cd02a3a18dd5f3945b1afddd81b2c5f" exitCode=143
Apr 20 10:37:25.962574 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.962550 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vddxz_must-gather-xf4g8_9d5b9e1e-7b1f-44c2-b419-092091ad19de/copy/0.log"
Apr 20 10:37:25.962903 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.962888 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:37:25.965340 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:25.965317 2577 status_manager.go:895] "Failed to get status for pod" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" pod="openshift-must-gather-vddxz/must-gather-xf4g8" err="pods \"must-gather-xf4g8\" is forbidden: User \"system:node:ip-10-0-140-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vddxz\": no relationship found between node 'ip-10-0-140-95.ec2.internal' and this object"
Apr 20 10:37:26.091175 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.091119 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output\") pod \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") "
Apr 20 10:37:26.091175 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.091153 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltrkp\" (UniqueName: \"kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp\") pod \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\" (UID: \"9d5b9e1e-7b1f-44c2-b419-092091ad19de\") "
Apr 20 10:37:26.093322 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.093296 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9d5b9e1e-7b1f-44c2-b419-092091ad19de" (UID: "9d5b9e1e-7b1f-44c2-b419-092091ad19de"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:37:26.093447 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.093430 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp" (OuterVolumeSpecName: "kube-api-access-ltrkp") pod "9d5b9e1e-7b1f-44c2-b419-092091ad19de" (UID: "9d5b9e1e-7b1f-44c2-b419-092091ad19de"). InnerVolumeSpecName "kube-api-access-ltrkp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:37:26.192205 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.192182 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d5b9e1e-7b1f-44c2-b419-092091ad19de-must-gather-output\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:37:26.192205 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.192203 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltrkp\" (UniqueName: \"kubernetes.io/projected/9d5b9e1e-7b1f-44c2-b419-092091ad19de-kube-api-access-ltrkp\") on node \"ip-10-0-140-95.ec2.internal\" DevicePath \"\""
Apr 20 10:37:26.861797 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.861771 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vddxz_must-gather-xf4g8_9d5b9e1e-7b1f-44c2-b419-092091ad19de/copy/0.log"
Apr 20 10:37:26.862228 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.862167 2577 scope.go:117] "RemoveContainer" containerID="94f2ffad5701682f25fb99cbdcad07f98cd02a3a18dd5f3945b1afddd81b2c5f"
Apr 20 10:37:26.862297 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.862221 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vddxz/must-gather-xf4g8"
Apr 20 10:37:26.871051 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:26.871028 2577 scope.go:117] "RemoveContainer" containerID="f7621dc5fa3ca60f423b7e8a8b01ead6e3b226050e42b5df430d12c92ca68ebe"
Apr 20 10:37:27.481322 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:27.481293 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-t7cfx_5cd2cf42-4b4b-4260-963f-fd7f94555d35/cluster-monitoring-operator/0.log"
Apr 20 10:37:27.840391 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:27.840369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4j86_cb15f011-6dbf-43b9-8367-0979ca21cb28/node-exporter/0.log"
Apr 20 10:37:27.859712 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:27.859684 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4j86_cb15f011-6dbf-43b9-8367-0979ca21cb28/kube-rbac-proxy/0.log"
Apr 20 10:37:27.885935 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:27.885916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4j86_cb15f011-6dbf-43b9-8367-0979ca21cb28/init-textfile/0.log"
Apr 20 10:37:28.048727 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.048705 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/prometheus/0.log"
Apr 20 10:37:28.074240 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.074217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/config-reloader/0.log"
Apr 20 10:37:28.084028 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.083976 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" path="/var/lib/kubelet/pods/9d5b9e1e-7b1f-44c2-b419-092091ad19de/volumes"
Apr 20 10:37:28.103587 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.103568 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/thanos-sidecar/0.log"
Apr 20 10:37:28.127791 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.127768 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/kube-rbac-proxy-web/0.log"
Apr 20 10:37:28.151524 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.151508 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/kube-rbac-proxy/0.log"
Apr 20 10:37:28.175673 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.175638 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/kube-rbac-proxy-thanos/0.log"
Apr 20 10:37:28.202624 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:28.202608 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_127e7995-5b41-480d-af98-90e6ff792104/init-config-reloader/0.log"
Apr 20 10:37:29.746916 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:29.746862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-7rdz4_5a97f42a-851b-4803-9be6-3ad666e6f307/networking-console-plugin/0.log"
Apr 20 10:37:30.298310 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.298283 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log"
Apr 20 10:37:30.302524 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.302504 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/3.log"
Apr 20 10:37:30.855377 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855349 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c"]
Apr 20 10:37:30.855776 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855708 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="copy"
Apr 20 10:37:30.855776 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855721 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="copy"
Apr 20 10:37:30.855776 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855731 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="gather"
Apr 20 10:37:30.855776 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855738 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="gather"
Apr 20 10:37:30.855919 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855804 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="gather"
Apr 20 10:37:30.855919 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.855814 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d5b9e1e-7b1f-44c2-b419-092091ad19de" containerName="copy"
Apr 20 10:37:30.860874 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.860854 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:30.866942 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.866780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"openshift-service-ca.crt\"" Apr 20 10:37:30.866942 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.866848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kqj2z\"/\"default-dockercfg-zn9bs\"" Apr 20 10:37:30.867998 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.867979 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"kube-root-ca.crt\"" Apr 20 10:37:30.883600 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:30.883580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c"] Apr 20 10:37:31.022146 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.022125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-podres\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.022252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.022152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-proc\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.022252 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.022188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-lib-modules\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.022328 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.022243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7ng\" (UniqueName: \"kubernetes.io/projected/83797b4f-014a-4b4d-b243-10650e6abac5-kube-api-access-qx7ng\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.022328 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.022316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-sys\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.122996 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.122975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-sys\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123099 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-podres\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 
20 10:37:31.123099 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-proc\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123173 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-sys\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123173 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-lib-modules\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123173 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-podres\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123173 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7ng\" (UniqueName: \"kubernetes.io/projected/83797b4f-014a-4b4d-b243-10650e6abac5-kube-api-access-qx7ng\") pod \"perf-node-gather-daemonset-djk9c\" (UID: 
\"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123305 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-proc\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.123305 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.123193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83797b4f-014a-4b4d-b243-10650e6abac5-lib-modules\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.138052 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.138027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7ng\" (UniqueName: \"kubernetes.io/projected/83797b4f-014a-4b4d-b243-10650e6abac5-kube-api-access-qx7ng\") pod \"perf-node-gather-daemonset-djk9c\" (UID: \"83797b4f-014a-4b4d-b243-10650e6abac5\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.170549 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.170530 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.299463 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.299440 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c"] Apr 20 10:37:31.301505 ip-10-0-140-95 kubenswrapper[2577]: W0420 10:37:31.301482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod83797b4f_014a_4b4d_b243_10650e6abac5.slice/crio-ecaa16b299c19845fd21c3437cbdd527602e5702813fdd2519cc9f5b79083a11 WatchSource:0}: Error finding container ecaa16b299c19845fd21c3437cbdd527602e5702813fdd2519cc9f5b79083a11: Status 404 returned error can't find the container with id ecaa16b299c19845fd21c3437cbdd527602e5702813fdd2519cc9f5b79083a11 Apr 20 10:37:31.533933 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.533902 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-djf8k_ae249ad7-50d4-4db6-be40-535b35542e1c/volume-data-source-validator/0.log" Apr 20 10:37:31.880579 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.880541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" event={"ID":"83797b4f-014a-4b4d-b243-10650e6abac5","Type":"ContainerStarted","Data":"f3783a2e875a6246344b316298f1315e0349e4dcfc8c840f74b89bcabaf787d6"} Apr 20 10:37:31.880579 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.880575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" event={"ID":"83797b4f-014a-4b4d-b243-10650e6abac5","Type":"ContainerStarted","Data":"ecaa16b299c19845fd21c3437cbdd527602e5702813fdd2519cc9f5b79083a11"} Apr 20 10:37:31.880989 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.880691 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:31.903301 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:31.903255 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" podStartSLOduration=1.903243043 podStartE2EDuration="1.903243043s" podCreationTimestamp="2026-04-20 10:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:37:31.900885576 +0000 UTC m=+2178.323946718" watchObservedRunningTime="2026-04-20 10:37:31.903243043 +0000 UTC m=+2178.326304186" Apr 20 10:37:32.327576 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:32.327500 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqrj9_2dce1807-1577-4d4f-8a49-740ba99a59ca/dns/0.log" Apr 20 10:37:32.397947 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:32.397920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqrj9_2dce1807-1577-4d4f-8a49-740ba99a59ca/kube-rbac-proxy/0.log" Apr 20 10:37:32.663112 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:32.663088 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fr996_a533ff84-2d61-4ccb-9b58-1eea4acb387d/dns-node-resolver/0.log" Apr 20 10:37:33.367957 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:33.367929 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zsznm_2eb7affb-8768-41ec-85fb-a62a41bb8709/node-ca/0.log" Apr 20 10:37:34.309976 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:34.309948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dbc9c6698-7l896_adc46263-5b99-4162-8415-8a084543bdad/router/0.log" Apr 20 10:37:34.690777 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:34.690747 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-5fzkf_84d8b916-498e-4189-840c-c6931e4b0d70/serve-healthcheck-canary/0.log" Apr 20 10:37:35.196919 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:35.196884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5fn5_9cedfb03-6b25-46db-a934-2933c2d42473/insights-operator/1.log" Apr 20 10:37:35.197370 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:35.197351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5fn5_9cedfb03-6b25-46db-a934-2933c2d42473/insights-operator/0.log" Apr 20 10:37:35.230910 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:35.230892 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84cjk_8e2995a2-df23-416f-8edb-670e8832f5ce/kube-rbac-proxy/0.log" Apr 20 10:37:35.311030 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:35.311010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84cjk_8e2995a2-df23-416f-8edb-670e8832f5ce/exporter/0.log" Apr 20 10:37:35.343170 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:35.343147 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-84cjk_8e2995a2-df23-416f-8edb-670e8832f5ce/extractor/0.log" Apr 20 10:37:37.238902 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:37.238876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-zsmm4_5017922b-dc80-478e-aec0-286a2d621d81/jobset-operator/0.log" Apr 20 10:37:37.896197 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:37.896166 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-djk9c" Apr 20 10:37:40.698878 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:40.698849 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zjnpb_9701f42d-6084-4aa6-9f1d-845738e47a33/migrator/0.log" Apr 20 10:37:40.727274 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:40.727250 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zjnpb_9701f42d-6084-4aa6-9f1d-845738e47a33/graceful-termination/0.log" Apr 20 10:37:41.075228 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:41.075200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dk6lf_289379f5-7b90-499a-a7cd-14690b1bb4b1/kube-storage-version-migrator-operator/1.log" Apr 20 10:37:41.104976 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:41.104943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dk6lf_289379f5-7b90-499a-a7cd-14690b1bb4b1/kube-storage-version-migrator-operator/0.log" Apr 20 10:37:42.022751 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.022724 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/kube-multus-additional-cni-plugins/0.log" Apr 20 10:37:42.043768 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.043746 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/egress-router-binary-copy/0.log" Apr 20 10:37:42.065983 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.065962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/cni-plugins/0.log" Apr 20 10:37:42.093160 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.093145 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/bond-cni-plugin/0.log" Apr 20 10:37:42.116502 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.116488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/routeoverride-cni/0.log" Apr 20 10:37:42.139844 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.139825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/whereabouts-cni-bincopy/0.log" Apr 20 10:37:42.163798 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.163782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4bp75_ada0f1d6-3214-4751-9778-3af57b7e44c0/whereabouts-cni/0.log" Apr 20 10:37:42.551239 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.551201 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kwc9j_0d20a880-ecf7-405a-98f4-141bc115d61b/kube-multus/0.log" Apr 20 10:37:42.736579 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.736555 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vs775_6a07ac99-a265-4370-a43b-b11246f741de/network-metrics-daemon/0.log" Apr 20 10:37:42.756240 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:42.756217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vs775_6a07ac99-a265-4370-a43b-b11246f741de/kube-rbac-proxy/0.log" Apr 20 10:37:43.931800 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:43.931769 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-controller/0.log" Apr 20 10:37:43.954609 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:37:43.954574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log" Apr 20 10:37:43.964604 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:43.964581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/1.log" Apr 20 10:37:43.982296 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:43.982278 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/kube-rbac-proxy-node/0.log" Apr 20 10:37:44.001995 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:44.001968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 10:37:44.028103 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:44.028084 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/northd/0.log" Apr 20 10:37:44.048459 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:44.048444 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/nbdb/0.log" Apr 20 10:37:44.068491 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:44.068475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/sbdb/0.log" Apr 20 10:37:44.161722 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:44.161695 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovnkube-controller/0.log" Apr 20 10:37:45.587011 ip-10-0-140-95 
kubenswrapper[2577]: I0420 10:37:45.586970 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4dcjg_73ef9bd2-5eeb-4e74-9dfe-17214e80e475/check-endpoints/0.log" Apr 20 10:37:45.630447 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:45.630421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gm5vg_ff96330b-c86e-4eab-8d6f-a6db1b630272/network-check-target-container/0.log" Apr 20 10:37:46.549564 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:46.549526 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tjxs7_5f23702f-f4ca-4d67-8cde-3f062233913d/iptables-alerter/0.log" Apr 20 10:37:47.372250 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:47.372225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-q7mlz_18150fd2-df3c-4c5b-9a5b-726839bc0ccc/tuned/0.log" Apr 20 10:37:49.394616 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:49.394585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k822b_e27ef370-a030-44aa-a961-156382685e11/cluster-samples-operator/0.log" Apr 20 10:37:49.461857 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:49.461833 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k822b_e27ef370-a030-44aa-a961-156382685e11/cluster-samples-operator-watch/0.log" Apr 20 10:37:50.553831 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:50.553795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-79qp4_81c6d571-c228-4b53-8d5e-c96359b3d8f6/service-ca-operator/1.log" Apr 20 10:37:50.554866 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:50.554849 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-79qp4_81c6d571-c228-4b53-8d5e-c96359b3d8f6/service-ca-operator/0.log" Apr 20 10:37:51.889646 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:51.889610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nztth_ab3edac7-ffce-45ee-83d2-17c48dd51ac6/csi-driver/0.log" Apr 20 10:37:51.923647 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:51.923619 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nztth_ab3edac7-ffce-45ee-83d2-17c48dd51ac6/csi-node-driver-registrar/0.log" Apr 20 10:37:51.964222 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:37:51.964197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nztth_ab3edac7-ffce-45ee-83d2-17c48dd51ac6/csi-liveness-probe/0.log" Apr 20 10:41:23.847392 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:41:23.847263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log" Apr 20 10:41:23.851629 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:41:23.849805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2zgz2_db045f44-d582-4037-82eb-d656372b093e/console-operator/2.log" Apr 20 10:41:23.851851 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:41:23.851830 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log" Apr 20 10:41:23.854418 ip-10-0-140-95 kubenswrapper[2577]: I0420 10:41:23.854393 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bxbxw_deaf1642-316d-4307-8ade-dc653dd9e116/ovn-acl-logging/0.log"