Apr 22 14:15:21.027453 ip-10-0-131-75 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:21.488020 ip-10-0-131-75 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:21.488020 ip-10-0-131-75 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:21.488020 ip-10-0-131-75 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:21.488020 ip-10-0-131-75 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:21.488020 ip-10-0-131-75 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:21.489813 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.489728    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:21.491967 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491953    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491969    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491973    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491976    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491980    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491983    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491985    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491988    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491990    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491993    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491995    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.491998    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492002    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492007    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:21.492003 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492010    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492014    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492017    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492020    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492022    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492026    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492030    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492039    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492042    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492045    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492048    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492050    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492053    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492055    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492058    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492061    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492063    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492065    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492068    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:21.492347 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492070    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492072    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492075    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492077    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492080    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492082    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492084    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492087    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492090    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492092    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492095    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492097    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492099    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492102    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492104    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492108    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492118    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492121    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492124    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492127    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:21.492794 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492130    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492133    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492136    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492138    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492141    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492143    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492146    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492149    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492151    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492154    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492156    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492160    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492163    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492165    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492168    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492170    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492173    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492187    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492190    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492193    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:21.493294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492196    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492202    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492205    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492208    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492210    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492213    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492215    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492220    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492223    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492226    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492228    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492231    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492234    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492592    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492597    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492600    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492602    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492605    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492608    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492611    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:21.493767 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492613    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492616    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492619    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492621    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492624    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492627    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492629    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492632    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492634    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492637    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492639    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492642    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492645    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492649    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492652    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492655    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492657    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492660    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492662    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492665    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:21.494266 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492668    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492670    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492673    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492675    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492678    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492680    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492682    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492685    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492687    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492690    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492692    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492695    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492697    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492700    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492702    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492705    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492708    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492710    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492713    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492715    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:21.494747 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492718    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492720    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492722    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492725    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492727    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492730    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492733    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492735    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492739    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492741    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492743    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492746    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492749    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492752    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492754    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492756    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492759    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492761    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492764    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:21.495232 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492767    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492771    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492775    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492777    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492780    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492783    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492786    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492788    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492791    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492793    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492796    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492799    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492801    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492804    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492806    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492809    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492812    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492814    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492816    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.492819    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:21.495685 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492891    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492903    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492909    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492914    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492918    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492922    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492926    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492931    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492934    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492938    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492941    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492944    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492947    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492950    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492953    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492956    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492959    2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492962    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492965    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492970    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492975    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492978    2577 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492981    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492985    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:21.496166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492989    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492992    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492995    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.492999    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493002    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493005    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493007    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493014    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493017    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493021    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493024    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493027    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493030    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493033    2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493036    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493040    2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493043    2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493046    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493049    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493052    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493055    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493058    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493061    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493064    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493067    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:21.496808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493070    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493072    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493075    2577 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493080 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493083 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493085 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493089 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493092 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493095 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493098 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493101 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493104 2577 flags.go:64] FLAG: --help="false" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493107 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493110 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493114 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493117 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493120 2577 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493124 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493127 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493129 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493132 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493135 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493138 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493141 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:15:21.497418 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493144 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493147 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493150 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493153 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493155 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493158 2577 flags.go:64] FLAG: --lock-file="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493161 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" 
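The `flags.go:64` lines above dump every kubelet flag with its effective value. If you want these as structured data (for example, to diff flag settings between two nodes), a minimal parsing sketch is below — the regex targets the `FLAG: --name="value"` format visible in this log; it is a hypothetical helper, not an official kubelet tool:

```python
import re

# Matches the kubelet flag-dump format seen above: FLAG: --name="value"
FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

def parse_flags(lines):
    """Return {flag_name: value} from kubelet 'flags.go:64' log lines."""
    flags = {}
    for line in lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

# Sample lines copied from the dump above
sample = [
    'I0422 14:15:21.492914 2577 flags.go:64] FLAG: --anonymous-auth="true"',
    'I0422 14:15:21.492926 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"',
]
print(parse_flags(sample))
# {'--anonymous-auth': 'true', '--authorization-mode': 'AlwaysAllow'}
```

Feeding it `journalctl -u kubelet --no-pager` output (line by line) would collect the whole dump into one dictionary.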
Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493164 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493167 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493171 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493186 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493191 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493194 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493196 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493200 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493203 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493206 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493213 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493216 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493220 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493222 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493225 2577 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493230 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493233 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493236 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:15:21.498008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493239 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493242 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493249 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493253 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493255 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493258 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493261 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493267 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493270 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493273 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 
14:15:21.493276 2577 flags.go:64] FLAG: --port="10250" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493279 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493282 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a80f9a9733d48f00" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493285 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493288 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493291 2577 flags.go:64] FLAG: --register-node="true" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493294 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493297 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493302 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493305 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493308 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493310 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493314 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493317 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493320 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:15:21.498617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493323 2577 
flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493326 2577 flags.go:64] FLAG: --runonce="false" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493328 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493331 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493334 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493339 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493342 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493344 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493348 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493351 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493354 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493357 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493360 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493363 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493366 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 
14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493368 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493371 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493376 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493379 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493382 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493386 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493389 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493391 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493394 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493397 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:15:21.499247 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493401 2577 flags.go:64] FLAG: --v="2" Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493406 2577 flags.go:64] FLAG: --version="false" Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493410 2577 flags.go:64] FLAG: --vmodule="" Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493414 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493417 2577 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493507 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493510 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493513 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493516 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493520 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493524 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493527 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493531 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493534 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493537 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493540 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493543 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 
22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493550 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493553 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493556 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:21.499875 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493558 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493561 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493564 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493567 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493570 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493572 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493575 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493577 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493580 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493582 2577 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493585 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493587 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493591 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493594 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493596 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493599 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493601 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493604 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493606 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493609 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:21.500372 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493611 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493614 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:21.500863 ip-10-0-131-75 
kubenswrapper[2577]: W0422 14:15:21.493616 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493619 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493622 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493625 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493628 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493630 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493633 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493635 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493638 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493641 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493643 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493646 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493648 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 
14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493651 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493653 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493656 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493658 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493661 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:21.500863 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493663 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493665 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493668 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493670 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493674 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493676 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493679 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493681 2577 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493684 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493686 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493689 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493691 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493694 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493696 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493698 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493701 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493706 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
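The `feature_gate.go:328` warnings above are emitted once per gate the upstream kubelet parser does not recognize — here mostly OpenShift cluster-level gates passed down in the node's feature-gate set, and many repeat. To summarize how many distinct gates are involved, a small counting sketch (the sample string mirrors the message format above; this is illustrative tooling, not part of the kubelet):

```python
import re
from collections import Counter

# Sample lines in the same format as the warnings above
sample = """\
W0422 14:15:21.491953 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
W0422 14:15:21.493619 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
W0422 14:15:21.491973 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
"""

# Count occurrences of each unrecognized gate name
counts = Counter(re.findall(r"unrecognized feature gate: (\w+)", sample))
for gate, n in counts.most_common():
    print(f"{n}x {gate}")
# 2x NetworkSegmentation
# 1x AlibabaPlatform
```

Run against the full journal, this makes it easy to confirm the warnings are repeats of a fixed gate set rather than an unbounded stream.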
Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493709 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493712 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493715 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:21.501351 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493718 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493720 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493723 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493726 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493728 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493731 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493733 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493736 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493738 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493740 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 
14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.493743 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.493752 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.500945 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.500966 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501016 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501022 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:21.501835 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501027 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501032 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501037 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501040 2577 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501043 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501045 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501048 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501051 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501054 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501056 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501059 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501062 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501065 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501067 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501070 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501072 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501076 2577 feature_gate.go:328] unrecognized 
feature gate: GatewayAPIController Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501079 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501081 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501084 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:21.502263 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501086 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501089 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501091 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501093 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501097 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501100 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501103 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501107 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501112 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: 
W0422 14:15:21.501118 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501120 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501123 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501126 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501129 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501131 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501134 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501136 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501139 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501141 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:21.502784 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501144 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501146 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501149 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501153 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501157 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501160 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501163 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501166 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501168 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501171 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501192 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501196 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501200 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501202 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501205 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:21.503259 
ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501207 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501210 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501213 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501215 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:21.503259 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501218 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501221 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501223 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501226 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501230 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501232 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501235 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501237 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501240 2577 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501242 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501245 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501248 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501252 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501255 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501259 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501263 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501267 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501271 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501275 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501278 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:21.503724 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501280 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: 
W0422 14:15:21.501283 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501285 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501289 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501292 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501294 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.501300 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501782 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501788 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501791 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501794 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:21.504209 ip-10-0-131-75 
kubenswrapper[2577]: W0422 14:15:21.501797 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501800 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501802 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501805 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501807 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:21.504209 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501810 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501814 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501816 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501819 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501821 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501826 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501830 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501834 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:21.504600 ip-10-0-131-75 
kubenswrapper[2577]: W0422 14:15:21.501838 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501841 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501844 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501847 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501849 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501852 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501854 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501857 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501859 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501862 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501864 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501869 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:15:21.504600 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501873 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501876 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501878 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501881 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501884 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501886 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501889 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501891 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501894 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501896 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501899 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501902 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501906 2577 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501911 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501916 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501918 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501921 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501924 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501926 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501929 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:21.505073 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501931 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501934 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501936 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501939 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501941 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501943 2577 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501946 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501948 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501951 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501954 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501956 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501959 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501961 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501964 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501967 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501971 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501973 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501976 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501978 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:21.505551 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501982 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501986 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501990 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501994 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.501997 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502000 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502002 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502005 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502009 2577 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502018 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502022 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502024 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502027 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502029 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502032 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502034 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502037 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:21.506006 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:21.502039 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.502044 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.502806 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.504836 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.505757 2577 server.go:1019] "Starting client certificate rotation" Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.505852 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:21.506503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.506221 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:21.533033 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.533010 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:21.536929 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.536905 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:21.556129 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.555970 2577 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:15:21.562171 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.562150 2577 log.go:25] "Validated CRI v1 image API" Apr 22 14:15:21.562650 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.562633 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:21.563702 
ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.563679 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:15:21.568254 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.568230 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 93df9930-e2e8-4fb2-9761-bf2377f5f3e8:/dev/nvme0n1p4 e8753619-5448-44ca-8656-7f51c651338a:/dev/nvme0n1p3] Apr 22 14:15:21.568337 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.568250 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 14:15:21.574995 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.574884 2577 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:21.57234637 +0000 UTC m=+0.419854346 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200119 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fe14ef6364361a13b7948ffe40893 SystemUUID:ec2fe14e-f636-4361-a13b-7948ffe40893 BootID:c09e6b31-fabc-41f6-a553-805176ff84cb Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 
HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b2:da:69:3a:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b2:da:69:3a:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:a5:00:ec:89:0c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 14:15:21.574995 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.574992 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 14:15:21.575102 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.575080 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:21.576274 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.576246 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:21.576429 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.576277 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-75.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:21.576476 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.576439 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:21.576476 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.576446 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:21.576476 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.576462 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:21.577259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.577248 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:21.578612 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.578602 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:21.578722 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.578713 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:21.581349 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.581338 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:21.582093 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.582081 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:21.582128 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.582106 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:21.582128 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.582118 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:21.582197 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.582129 2577 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 14:15:21.582732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.582708 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nst8w" Apr 22 14:15:21.583269 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.583249 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:21.583332 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.583276 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:21.586811 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.586788 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:21.588090 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.588076 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:21.589837 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589824 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589842 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589852 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589861 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589869 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589875 2577 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589880 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589886 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589894 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:21.589899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589900 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:21.590210 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589909 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:21.590210 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.589918 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:21.592208 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.592194 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:21.592208 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.592205 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:21.592875 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.592809 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-75.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:21.592937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.592909 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:21.592937 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.592914 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nst8w" Apr 22 14:15:21.595944 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.595930 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:21.596001 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.595967 2577 server.go:1295] "Started kubelet" Apr 22 14:15:21.596113 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.596064 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:21.596191 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.596129 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:21.596235 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.596215 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:21.596784 ip-10-0-131-75 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 14:15:21.598037 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.598022 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:21.598970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.598952 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:21.602625 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.602604 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:21.603237 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.603219 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 14:15:21.604076 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604056 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 14:15:21.604203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604153 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 14:15:21.604203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604167 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 14:15:21.604203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604155 2577 factory.go:55] Registering systemd factory Apr 22 14:15:21.604339 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604247 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 14:15:21.604382 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604351 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 14:15:21.604382 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604360 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 14:15:21.604670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604653 2577 factory.go:153] Registering CRI-O factory Apr 22 14:15:21.604670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604670 2577 factory.go:223] Registration of the crio container factory successfully Apr 
22 14:15:21.604792 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604733 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 14:15:21.604792 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604760 2577 factory.go:103] Registering Raw factory Apr 22 14:15:21.604792 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.604777 2577 manager.go:1196] Started watching for new ooms in manager Apr 22 14:15:21.605217 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.605201 2577 manager.go:319] Starting recovery of all containers Apr 22 14:15:21.605302 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.605279 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:21.607368 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.607328 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:21.608318 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.608300 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 14:15:21.610378 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.610123 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-75.ec2.internal" not found Apr 22 14:15:21.610378 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.610128 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-75.ec2.internal\" not found" node="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.616193 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.616162 2577 manager.go:324] Recovery completed Apr 22 14:15:21.620387 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.620376 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.622731 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.622716 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.622797 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.622745 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.622797 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.622758 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.623215 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.623203 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:21.623215 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.623213 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:21.623290 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.623230 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:21.625007 ip-10-0-131-75 kubenswrapper[2577]: 
I0422 14:15:21.624995 2577 policy_none.go:49] "None policy: Start" Apr 22 14:15:21.625061 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.625010 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:21.625061 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.625020 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:21.627994 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.627979 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-75.ec2.internal" not found Apr 22 14:15:21.667810 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.667793 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.667832 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.667845 2577 server.go:85] "Starting device plugin registration server" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.668105 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.668119 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.668252 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.668331 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.668338 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.669246 2577 eviction_manager.go:267] "eviction manager: failed 
to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 14:15:21.686668 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.669302 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:21.688997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.688978 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-75.ec2.internal" not found Apr 22 14:15:21.741828 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.741734 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 14:15:21.743000 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.742977 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 14:15:21.743000 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.743005 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 14:15:21.743200 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.743027 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 14:15:21.743200 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.743036 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 14:15:21.743200 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.743077 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 14:15:21.746272 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.746256 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:21.768591 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.768566 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.769525 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.769510 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.769610 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.769538 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.769610 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.769549 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.769610 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.769572 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.779669 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.779650 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.779722 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.779675 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-75.ec2.internal\": node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:21.795595 
ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.795572 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:21.843417 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.843385 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal"] Apr 22 14:15:21.843493 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.843481 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.844421 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.844407 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.844485 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.844438 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.844485 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.844448 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.846224 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846210 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.846346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846332 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.846395 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846359 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.846920 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846904 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.847028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846927 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.847028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846941 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.847028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846945 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.847028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846962 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.847028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.846970 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.848593 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.848577 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.848677 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.848602 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:21.849238 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.849224 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:21.849310 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.849247 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:21.849310 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.849261 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:21.870761 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.870729 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-75.ec2.internal\" not found" node="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.874796 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.874773 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-75.ec2.internal\" not found" node="ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.895977 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.895948 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:21.905689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.905662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.905803 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.905695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.905803 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:21.905719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab777a15bb250a55fa506fcc23a847b8-config\") pod \"kube-apiserver-proxy-ip-10-0-131-75.ec2.internal\" (UID: \"ab777a15bb250a55fa506fcc23a847b8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" Apr 22 14:15:21.996974 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:21.996892 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.005853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.005828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.005962 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.005858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.005962 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.005880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab777a15bb250a55fa506fcc23a847b8-config\") pod \"kube-apiserver-proxy-ip-10-0-131-75.ec2.internal\" (UID: \"ab777a15bb250a55fa506fcc23a847b8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.005962 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.005945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.006090 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.005985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c3613a9a7724d83b7f562a919909906-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal\" (UID: \"3c3613a9a7724d83b7f562a919909906\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.006090 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.006022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab777a15bb250a55fa506fcc23a847b8-config\") pod \"kube-apiserver-proxy-ip-10-0-131-75.ec2.internal\" (UID: \"ab777a15bb250a55fa506fcc23a847b8\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.097934 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.097892 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.173390 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.173362 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.176928 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.176909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" Apr 22 14:15:22.198239 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.198216 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.298806 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.298719 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.399264 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.399239 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.499750 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.499717 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-75.ec2.internal\" not found" Apr 22 14:15:22.506024 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.506007 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:22.506165 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.506145 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:22.506217 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.506154 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:22.563633 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.563565 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:22.582807 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.582784 2577 apiserver.go:52] "Watching apiserver"
Apr 22 14:15:22.591507 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.591484 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 14:15:22.593162 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.593137 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sfm8m","openshift-network-diagnostics/network-check-target-dg6v9","openshift-ovn-kubernetes/ovnkube-node-6zgvp","kube-system/konnectivity-agent-zcvnd","openshift-dns/node-resolver-dtg5l","openshift-network-operator/iptables-alerter-zf5kn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr","openshift-cluster-node-tuning-operator/tuned-x5ds9","openshift-image-registry/node-ca-5hkxf","openshift-multus/multus-additional-cni-plugins-pq7wr","openshift-multus/multus-bs9cg"]
Apr 22 14:15:22.594804 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.594777 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:21 +0000 UTC" deadline="2027-10-12 14:47:36.193578529 +0000 UTC"
Apr 22 14:15:22.594804 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.594799 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12912h32m13.598781832s"
Apr 22 14:15:22.597489 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.597466 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.597601 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.597549 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:22.597654 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.597612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:22.597698 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.597670 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:22.599707 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.599684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.601734 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.601713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:22.602752 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.602736 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:22.603131 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 14:15:22.603131 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.603337 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 14:15:22.603395 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603383 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal"
Apr 22 14:15:22.603596 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603474 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 14:15:22.603596 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603488 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 14:15:22.603596 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603504 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dsq29\""
Apr 22 14:15:22.603596 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603520 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.603869 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.603855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.604046 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.604030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 14:15:22.604165 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.604151 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8rdvj\""
Apr 22 14:15:22.604248 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.604211 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 14:15:22.605974 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.605961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.606905 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.606880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.607001 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.606932 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.607079 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.607058 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-972hd\""
Apr 22 14:15:22.608479 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608459 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.608574 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608543 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.608649 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608460 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gn89g\""
Apr 22 14:15:22.608795 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 14:15:22.608952 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-systemd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609027 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.608967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609027 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebee87d9-f911-404c-9e1c-6e244d6b60cd-konnectivity-ca\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:22.609139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-config\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609258 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebee87d9-f911-404c-9e1c-6e244d6b60cd-agent-certs\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:22.609258 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-host-slash\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.609399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-var-lib-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-etc-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-ovn\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hwb\" (UniqueName: \"kubernetes.io/projected/5e382d5b-073e-4cd5-adc4-f9741cc073d8-kube-api-access-k6hwb\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.609581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:22.609581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-netd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.609581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-systemd-units\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609806 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609616 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-env-overrides\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609806 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87f925f7-d447-4a1f-b742-10a72c9ef6a9-tmp-dir\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.609806 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-netns\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609806 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-node-log\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609929 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-bin\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609929 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-script-lib\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.609929 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-iptables-alerter-script\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.610048 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.609952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-log-socket\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.610048 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovn-node-metrics-cert\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.610149 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6khw\" (UniqueName: \"kubernetes.io/projected/e13fc5ca-d417-47c6-8b6c-63651dc87d31-kube-api-access-z6khw\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.610213 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87f925f7-d447-4a1f-b742-10a72c9ef6a9-hosts-file\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.610271 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgz8\" (UniqueName: \"kubernetes.io/projected/87f925f7-d447-4a1f-b742-10a72c9ef6a9-kube-api-access-scgz8\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.610327 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkm6j\" (UniqueName: \"kubernetes.io/projected/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-kube-api-access-lkm6j\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.610380 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-kubelet\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.610433 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-slash\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.611207 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.611207 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.610953 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.612969 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.612940 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.613198 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 14:15:22.613326 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5hkxf"
Apr 22 14:15:22.613415 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.613519 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613508 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8tqhs\""
Apr 22 14:15:22.613692 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74vdz\""
Apr 22 14:15:22.613779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.613751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.614832 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.614815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.615485 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.615468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fkrmd\""
Apr 22 14:15:22.615801 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.615780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.615801 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.615798 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 14:15:22.615921 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.615886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.617779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.617760 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal"]
Apr 22 14:15:22.617845 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.617799 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.617944 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.617800 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.619396 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.619382 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:22.619461 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.619450 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal"
Apr 22 14:15:22.620167 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 14:15:22.620527 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 14:15:22.620563 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620537 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 14:15:22.620606 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4mvbk\""
Apr 22 14:15:22.620795 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 14:15:22.620832 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620787 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 14:15:22.620997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.620986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnq4b\""
Apr 22 14:15:22.621077 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.621063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 14:15:22.621833 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.621819 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:22.628577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.628558 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal"]
Apr 22 14:15:22.628670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.628657 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:22.652968 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.652950 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pgz67"
Apr 22 14:15:22.662091 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.662071 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pgz67"
Apr 22 14:15:22.702362 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.702335 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3613a9a7724d83b7f562a919909906.slice/crio-51079cc137cada9cd984c60a70a15fd2e5de7b485caa08d54b316e41b0ba037b WatchSource:0}: Error finding container 51079cc137cada9cd984c60a70a15fd2e5de7b485caa08d54b316e41b0ba037b: Status 404 returned error can't find the container with id 51079cc137cada9cd984c60a70a15fd2e5de7b485caa08d54b316e41b0ba037b
Apr 22 14:15:22.704614 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.704598 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:22.707928 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.707908 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:22.710547 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-var-lib-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710611 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-ovn\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710611 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v47v8\" (UniqueName: \"kubernetes.io/projected/de6e4092-f486-48f9-b9c5-7b146b3d9c83-kube-api-access-v47v8\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.710694 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-kubelet\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.710694 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-ovn\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710694 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-var-lib-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710694 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-hostroot\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-etc-kubernetes\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-device-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hwb\" (UniqueName: \"kubernetes.io/projected/5e382d5b-073e-4cd5-adc4-f9741cc073d8-kube-api-access-k6hwb\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-netd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-multus-daemon-config\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-netd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.710871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-env-overrides\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87f925f7-d447-4a1f-b742-10a72c9ef6a9-tmp-dir\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-socket-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.710943 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not
registered Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.710982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-conf\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.711037 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:23.210995765 +0000 UTC m=+2.058503748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cnibin\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4532406b-2b3c-4280-be31-a1a417b34d6c-serviceca\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:22.711251 ip-10-0-131-75 
kubenswrapper[2577]: I0422 14:15:22.711134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-node-log\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-bin\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-node-log\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-netns\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-cni-bin\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711251 ip-10-0-131-75 kubenswrapper[2577]: I0422 
14:15:22.711249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-tmp\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovn-node-metrics-cert\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-env-overrides\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87f925f7-d447-4a1f-b742-10a72c9ef6a9-tmp-dir\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-run\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711356 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-slash\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-systemd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebee87d9-f911-404c-9e1c-6e244d6b60cd-konnectivity-ca\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd" Apr 22 
14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-systemd\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-system-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-slash\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-os-release\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-config\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.711970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebee87d9-f911-404c-9e1c-6e244d6b60cd-agent-certs\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-host-slash\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711624 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-cni-binary-copy\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtp92\" (UniqueName: \"kubernetes.io/projected/dbe24998-6780-4517-aeb1-716266573102-kube-api-access-qtp92\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4532406b-2b3c-4280-be31-a1a417b34d6c-host\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6q8h\" (UniqueName: \"kubernetes.io/projected/4532406b-2b3c-4280-be31-a1a417b34d6c-kube-api-access-d6q8h\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-etc-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-multus\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-conf-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-registration-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711876 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysconfig\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-lib-modules\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711945 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-var-lib-kubelet\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ebee87d9-f911-404c-9e1c-6e244d6b60cd-konnectivity-ca\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd" Apr 22 
14:15:22.712887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-config\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.711984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-tuned\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-etc-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ln2\" (UniqueName: \"kubernetes.io/projected/77146340-5d2a-4222-813e-ac3db16a7bcc-kube-api-access-82ln2\") pod \"tuned-x5ds9\" (UID: 
\"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-host-slash\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-systemd-units\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712565 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-systemd-units\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-socket-dir-parent\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-run-openvswitch\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-systemd\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-host\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-netns\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-kubernetes\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-bin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-script-lib\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.713670 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-run-netns\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-iptables-alerter-script\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712855 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-system-cni-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-cnibin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-k8s-cni-cncf-io\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gzmp\" (UniqueName: \"kubernetes.io/projected/cdaf48d9-50eb-4523-bd5a-3de107220028-kube-api-access-6gzmp\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.712981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-log-socket\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6khw\" (UniqueName: \"kubernetes.io/projected/e13fc5ca-d417-47c6-8b6c-63651dc87d31-kube-api-access-z6khw\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87f925f7-d447-4a1f-b742-10a72c9ef6a9-hosts-file\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scgz8\" (UniqueName: \"kubernetes.io/projected/87f925f7-d447-4a1f-b742-10a72c9ef6a9-kube-api-access-scgz8\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkm6j\" (UniqueName: \"kubernetes.io/projected/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-kube-api-access-lkm6j\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-sys-fs\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-sys\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-kubelet\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.714496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-modprobe-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-os-release\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-binary-copy\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovnkube-script-lib\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-host-kubelet\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-multus-certs\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e13fc5ca-d417-47c6-8b6c-63651dc87d31-log-socket\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.713930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87f925f7-d447-4a1f-b742-10a72c9ef6a9-hosts-file\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.714108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-iptables-alerter-script\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.714127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13fc5ca-d417-47c6-8b6c-63651dc87d31-ovn-node-metrics-cert\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.714417 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab777a15bb250a55fa506fcc23a847b8.slice/crio-69758585c50afd4f2fe8d6c67a726232dd6b5dc844bff9603ce2887a25ecd5ee WatchSource:0}: Error finding container 69758585c50afd4f2fe8d6c67a726232dd6b5dc844bff9603ce2887a25ecd5ee: Status 404 returned error can't find the container with id 69758585c50afd4f2fe8d6c67a726232dd6b5dc844bff9603ce2887a25ecd5ee
Apr 22 14:15:22.715155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.714476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ebee87d9-f911-404c-9e1c-6e244d6b60cd-agent-certs\") pod \"konnectivity-agent-zcvnd\" (UID: \"ebee87d9-f911-404c-9e1c-6e244d6b60cd\") " pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:22.725860 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.725842 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:22.725937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.725863 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:22.725937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.725876 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:22.725937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:22.725933 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:23.225914672 +0000 UTC m=+2.073422637 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:22.727283 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.727262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hwb\" (UniqueName: \"kubernetes.io/projected/5e382d5b-073e-4cd5-adc4-f9741cc073d8-kube-api-access-k6hwb\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:22.727424 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.727369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkm6j\" (UniqueName: \"kubernetes.io/projected/c56bbf69-677d-48ac-9bdd-3f2234c4ebe1-kube-api-access-lkm6j\") pod \"iptables-alerter-zf5kn\" (UID: \"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1\") " pod="openshift-network-operator/iptables-alerter-zf5kn"
Apr 22 14:15:22.728148 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.728131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgz8\" (UniqueName: \"kubernetes.io/projected/87f925f7-d447-4a1f-b742-10a72c9ef6a9-kube-api-access-scgz8\") pod \"node-resolver-dtg5l\" (UID: \"87f925f7-d447-4a1f-b742-10a72c9ef6a9\") " pod="openshift-dns/node-resolver-dtg5l"
Apr 22 14:15:22.728290 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.728276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6khw\" (UniqueName: \"kubernetes.io/projected/e13fc5ca-d417-47c6-8b6c-63651dc87d31-kube-api-access-z6khw\") pod \"ovnkube-node-6zgvp\" (UID: \"e13fc5ca-d417-47c6-8b6c-63651dc87d31\") " pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:22.745374 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.745343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" event={"ID":"3c3613a9a7724d83b7f562a919909906","Type":"ContainerStarted","Data":"51079cc137cada9cd984c60a70a15fd2e5de7b485caa08d54b316e41b0ba037b"}
Apr 22 14:15:22.746245 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.746227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" event={"ID":"ab777a15bb250a55fa506fcc23a847b8","Type":"ContainerStarted","Data":"69758585c50afd4f2fe8d6c67a726232dd6b5dc844bff9603ce2887a25ecd5ee"}
Apr 22 14:15:22.815107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-kubernetes\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-bin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-system-cni-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-cnibin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-system-cni-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-bin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-kubernetes\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-cnibin\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-k8s-cni-cncf-io\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gzmp\" (UniqueName: \"kubernetes.io/projected/cdaf48d9-50eb-4523-bd5a-3de107220028-kube-api-access-6gzmp\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-k8s-cni-cncf-io\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-sys-fs\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.815391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-sys\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-sys-fs\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-modprobe-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-os-release\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-binary-copy\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-sys\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-multus-certs\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-modprobe-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-os-release\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v47v8\" (UniqueName: \"kubernetes.io/projected/de6e4092-f486-48f9-b9c5-7b146b3d9c83-kube-api-access-v47v8\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-multus-certs\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-kubelet\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-kubelet\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.815839 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-hostroot\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-etc-kubernetes\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-hostroot\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-device-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-etc-kubernetes\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-multus-daemon-config\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-device-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-socket-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-conf\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cnibin\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4532406b-2b3c-4280-be31-a1a417b34d6c-serviceca\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cnibin\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-netns\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815920 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-conf\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-tmp\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.816600 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815805 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-run-netns\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-run\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-system-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-run\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.815857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-socket-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-os-release\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de6e4092-f486-48f9-b9c5-7b146b3d9c83-cni-binary-copy\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-cni-binary-copy\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-os-release\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtp92\" (UniqueName: \"kubernetes.io/projected/dbe24998-6780-4517-aeb1-716266573102-kube-api-access-qtp92\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4532406b-2b3c-4280-be31-a1a417b34d6c-host\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-system-cni-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6q8h\" (UniqueName: \"kubernetes.io/projected/4532406b-2b3c-4280-be31-a1a417b34d6c-kube-api-access-d6q8h\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4532406b-2b3c-4280-be31-a1a417b34d6c-host\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-multus\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-conf-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg"
Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName:
\"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-registration-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysconfig\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4532406b-2b3c-4280-be31-a1a417b34d6c-serviceca\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-lib-modules\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-conf-dir\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-multus-daemon-config\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-var-lib-kubelet\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-etc-selinux\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-host-var-lib-cni-multus\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/cdaf48d9-50eb-4523-bd5a-3de107220028-registration-dir\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-tuned\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-lib-modules\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82ln2\" (UniqueName: \"kubernetes.io/projected/77146340-5d2a-4222-813e-ac3db16a7bcc-kube-api-access-82ln2\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysconfig\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-sysctl-d\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-var-lib-kubelet\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.817589 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-socket-dir-parent\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbe24998-6780-4517-aeb1-716266573102-cni-binary-copy\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-systemd\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-host\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbe24998-6780-4517-aeb1-716266573102-multus-socket-dir-parent\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-systemd\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.816659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77146340-5d2a-4222-813e-ac3db16a7bcc-host\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.818068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.817966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-tmp\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.818274 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.818159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77146340-5d2a-4222-813e-ac3db16a7bcc-etc-tuned\") pod 
\"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.824949 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.824651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v47v8\" (UniqueName: \"kubernetes.io/projected/de6e4092-f486-48f9-b9c5-7b146b3d9c83-kube-api-access-v47v8\") pod \"multus-additional-cni-plugins-pq7wr\" (UID: \"de6e4092-f486-48f9-b9c5-7b146b3d9c83\") " pod="openshift-multus/multus-additional-cni-plugins-pq7wr" Apr 22 14:15:22.825716 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.825317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ln2\" (UniqueName: \"kubernetes.io/projected/77146340-5d2a-4222-813e-ac3db16a7bcc-kube-api-access-82ln2\") pod \"tuned-x5ds9\" (UID: \"77146340-5d2a-4222-813e-ac3db16a7bcc\") " pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.825716 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.825454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gzmp\" (UniqueName: \"kubernetes.io/projected/cdaf48d9-50eb-4523-bd5a-3de107220028-kube-api-access-6gzmp\") pod \"aws-ebs-csi-driver-node-n5lbr\" (UID: \"cdaf48d9-50eb-4523-bd5a-3de107220028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.826427 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.826409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6q8h\" (UniqueName: \"kubernetes.io/projected/4532406b-2b3c-4280-be31-a1a417b34d6c-kube-api-access-d6q8h\") pod \"node-ca-5hkxf\" (UID: \"4532406b-2b3c-4280-be31-a1a417b34d6c\") " pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:22.827414 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.827398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtp92\" (UniqueName: 
\"kubernetes.io/projected/dbe24998-6780-4517-aeb1-716266573102-kube-api-access-qtp92\") pod \"multus-bs9cg\" (UID: \"dbe24998-6780-4517-aeb1-716266573102\") " pod="openshift-multus/multus-bs9cg" Apr 22 14:15:22.927543 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.927513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:15:22.933304 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.933284 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13fc5ca_d417_47c6_8b6c_63651dc87d31.slice/crio-bcf52bb82b5cdd7ddbd8fb89a55480fde687b58892cf7fcdb111ecf2e90ea050 WatchSource:0}: Error finding container bcf52bb82b5cdd7ddbd8fb89a55480fde687b58892cf7fcdb111ecf2e90ea050: Status 404 returned error can't find the container with id bcf52bb82b5cdd7ddbd8fb89a55480fde687b58892cf7fcdb111ecf2e90ea050 Apr 22 14:15:22.940369 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.940352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zcvnd" Apr 22 14:15:22.946488 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.946465 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebee87d9_f911_404c_9e1c_6e244d6b60cd.slice/crio-6f89356112c19cf344b3970f603fe25f97ba1648b48b73ee01ef3c6020c4f904 WatchSource:0}: Error finding container 6f89356112c19cf344b3970f603fe25f97ba1648b48b73ee01ef3c6020c4f904: Status 404 returned error can't find the container with id 6f89356112c19cf344b3970f603fe25f97ba1648b48b73ee01ef3c6020c4f904 Apr 22 14:15:22.958256 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.958236 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dtg5l" Apr 22 14:15:22.963921 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.963905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf5kn" Apr 22 14:15:22.964246 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.964215 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f925f7_d447_4a1f_b742_10a72c9ef6a9.slice/crio-5c18e18b700db6a994a17cb4afdee9a53600b6726938c325c914f3bb0040e88d WatchSource:0}: Error finding container 5c18e18b700db6a994a17cb4afdee9a53600b6726938c325c914f3bb0040e88d: Status 404 returned error can't find the container with id 5c18e18b700db6a994a17cb4afdee9a53600b6726938c325c914f3bb0040e88d Apr 22 14:15:22.970702 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.970682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56bbf69_677d_48ac_9bdd_3f2234c4ebe1.slice/crio-7e846e9667fef1e64bc1962810ef88da6f6be0b08c33be2a005e886efddab206 WatchSource:0}: Error finding container 7e846e9667fef1e64bc1962810ef88da6f6be0b08c33be2a005e886efddab206: Status 404 returned error can't find the container with id 7e846e9667fef1e64bc1962810ef88da6f6be0b08c33be2a005e886efddab206 Apr 22 14:15:22.978961 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.978944 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" Apr 22 14:15:22.984230 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:22.984206 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" Apr 22 14:15:22.984360 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.984343 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdaf48d9_50eb_4523_bd5a_3de107220028.slice/crio-f51e989ab7272eb8fe5585baf4b9bc6267259cfcdd65a10d417752221f67b8ec WatchSource:0}: Error finding container f51e989ab7272eb8fe5585baf4b9bc6267259cfcdd65a10d417752221f67b8ec: Status 404 returned error can't find the container with id f51e989ab7272eb8fe5585baf4b9bc6267259cfcdd65a10d417752221f67b8ec Apr 22 14:15:22.991290 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:22.991269 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77146340_5d2a_4222_813e_ac3db16a7bcc.slice/crio-60dedaa3757b751150633268badbf241da53d6e6868081973cd7342460733f58 WatchSource:0}: Error finding container 60dedaa3757b751150633268badbf241da53d6e6868081973cd7342460733f58: Status 404 returned error can't find the container with id 60dedaa3757b751150633268badbf241da53d6e6868081973cd7342460733f58 Apr 22 14:15:23.007371 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.007357 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5hkxf" Apr 22 14:15:23.012294 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:23.012274 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4532406b_2b3c_4280_be31_a1a417b34d6c.slice/crio-a1323a492aa49f62a0861c46cb6980b3617b3b1972d7a964a3d0009ec67c36e6 WatchSource:0}: Error finding container a1323a492aa49f62a0861c46cb6980b3617b3b1972d7a964a3d0009ec67c36e6: Status 404 returned error can't find the container with id a1323a492aa49f62a0861c46cb6980b3617b3b1972d7a964a3d0009ec67c36e6 Apr 22 14:15:23.017921 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.017907 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" Apr 22 14:15:23.022699 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.022681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bs9cg" Apr 22 14:15:23.022910 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:23.022881 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde6e4092_f486_48f9_b9c5_7b146b3d9c83.slice/crio-44501a200086ca65304bccee8bc4ab53a345da0f3c9a021a9f84c070fa4bd9cc WatchSource:0}: Error finding container 44501a200086ca65304bccee8bc4ab53a345da0f3c9a021a9f84c070fa4bd9cc: Status 404 returned error can't find the container with id 44501a200086ca65304bccee8bc4ab53a345da0f3c9a021a9f84c070fa4bd9cc Apr 22 14:15:23.028290 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:23.028273 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe24998_6780_4517_aeb1_716266573102.slice/crio-f18c69bd96823c1bd60aef8fda69a966d3afdd111852e6faefc336fdbd6b59a4 WatchSource:0}: Error finding container 
f18c69bd96823c1bd60aef8fda69a966d3afdd111852e6faefc336fdbd6b59a4: Status 404 returned error can't find the container with id f18c69bd96823c1bd60aef8fda69a966d3afdd111852e6faefc336fdbd6b59a4 Apr 22 14:15:23.158083 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.157988 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:23.219668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.219644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:23.219812 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.219765 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:23.219874 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.219817 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:24.219800032 +0000 UTC m=+3.067308017 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:23.320242 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.319999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:23.320242 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.320171 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:23.320242 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.320212 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:23.320242 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.320226 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:23.320553 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.320294 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:24.320264138 +0000 UTC m=+3.167772102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:23.663612 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.663577 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:22 +0000 UTC" deadline="2028-01-22 05:17:03.564031882 +0000 UTC" Apr 22 14:15:23.663612 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.663611 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15351h1m39.900424635s" Apr 22 14:15:23.746013 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.745978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:23.746203 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:23.746100 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:23.783118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.783082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bs9cg" event={"ID":"dbe24998-6780-4517-aeb1-716266573102","Type":"ContainerStarted","Data":"f18c69bd96823c1bd60aef8fda69a966d3afdd111852e6faefc336fdbd6b59a4"} Apr 22 14:15:23.797581 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.797518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerStarted","Data":"44501a200086ca65304bccee8bc4ab53a345da0f3c9a021a9f84c070fa4bd9cc"} Apr 22 14:15:23.821346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.821313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5hkxf" event={"ID":"4532406b-2b3c-4280-be31-a1a417b34d6c","Type":"ContainerStarted","Data":"a1323a492aa49f62a0861c46cb6980b3617b3b1972d7a964a3d0009ec67c36e6"} Apr 22 14:15:23.836693 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.836570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" event={"ID":"77146340-5d2a-4222-813e-ac3db16a7bcc","Type":"ContainerStarted","Data":"60dedaa3757b751150633268badbf241da53d6e6868081973cd7342460733f58"} Apr 22 14:15:23.857032 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.857009 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:23.860676 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.860613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf5kn" 
event={"ID":"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1","Type":"ContainerStarted","Data":"7e846e9667fef1e64bc1962810ef88da6f6be0b08c33be2a005e886efddab206"} Apr 22 14:15:23.884861 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.884836 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zcvnd" event={"ID":"ebee87d9-f911-404c-9e1c-6e244d6b60cd","Type":"ContainerStarted","Data":"6f89356112c19cf344b3970f603fe25f97ba1648b48b73ee01ef3c6020c4f904"} Apr 22 14:15:23.909287 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.909267 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:23.918375 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.918314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" event={"ID":"cdaf48d9-50eb-4523-bd5a-3de107220028","Type":"ContainerStarted","Data":"f51e989ab7272eb8fe5585baf4b9bc6267259cfcdd65a10d417752221f67b8ec"} Apr 22 14:15:23.938827 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.938802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dtg5l" event={"ID":"87f925f7-d447-4a1f-b742-10a72c9ef6a9","Type":"ContainerStarted","Data":"5c18e18b700db6a994a17cb4afdee9a53600b6726938c325c914f3bb0040e88d"} Apr 22 14:15:23.954870 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:23.954844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"bcf52bb82b5cdd7ddbd8fb89a55480fde687b58892cf7fcdb111ecf2e90ea050"} Apr 22 14:15:24.228005 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:24.227930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod 
\"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:24.228157 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.228087 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:24.228157 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.228141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:26.22812338 +0000 UTC m=+5.075631347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:24.330493 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:24.330457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:24.330646 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.330610 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:24.330646 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.330630 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
22 14:15:24.330646 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.330642 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:24.330795 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.330694 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:26.330675889 +0000 UTC m=+5.178183851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:24.664251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:24.664062 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:22 +0000 UTC" deadline="2028-02-02 03:13:50.393968068 +0000 UTC" Apr 22 14:15:24.664251 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:24.664098 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15612h58m25.72987438s" Apr 22 14:15:24.743907 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:24.743235 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:24.743907 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:24.743367 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:25.746270 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.746240 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:25.746687 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:25.746363 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:25.763498 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.762608 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8pz7t"] Apr 22 14:15:25.766045 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.766023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.766158 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:25.766100 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:25.842853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.842818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.842996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.842899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-kubelet-config\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.842996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.842925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-dbus\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.943656 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.943620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-kubelet-config\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.943797 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.943670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-dbus\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.943797 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.943734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.943899 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:25.943880 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:25.943961 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:25.943950 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:26.443929279 +0000 UTC m=+5.291437242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:25.944250 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.944230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-kubelet-config\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:25.944385 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:25.944368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75ed979a-756f-4aa8-938c-caef257181c3-dbus\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:26.246905 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:26.246708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:26.246905 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.246829 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:26.246905 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.246894 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 
nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.246876736 +0000 UTC m=+9.094384720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:26.348644 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:26.348053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:26.348644 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.348226 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:26.348644 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.348245 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:26.348644 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.348258 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:26.348644 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.348312 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv 
podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.348293422 +0000 UTC m=+9.195801394 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:26.448980 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:26.448945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:26.449131 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.449087 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.449208 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.449158 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:27.44913856 +0000 UTC m=+6.296646544 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.743284 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:26.743167 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:26.743447 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:26.743308 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:27.459052 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:27.459018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:27.459501 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:27.459157 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:27.459501 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:27.459232 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:29.459213885 +0000 UTC m=+8.306721850 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:27.743732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:27.743656 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:27.743881 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:27.743797 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:27.743936 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:27.743843 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:27.744079 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:27.744027 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:28.744145 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:28.744099 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:28.744587 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:28.744255 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:29.472899 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:29.472867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:29.473160 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:29.473034 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:29.473160 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:29.473115 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.473092948 +0000 UTC m=+12.320600925 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:29.744506 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:29.743977 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:29.744506 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:29.744102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:29.744506 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:29.744266 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:29.744506 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:29.744365 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:30.279496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:30.279458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:30.279673 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.279638 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:30.279754 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.279699 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:38.279682268 +0000 UTC m=+17.127190233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:30.380874 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:30.380831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:30.381044 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.381029 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:30.381098 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.381050 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:30.381098 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.381062 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:30.381169 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.381128 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:38.38110943 +0000 UTC m=+17.228617398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:30.744255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:30.744168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:30.744405 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:30.744285 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:31.744143 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:31.744092 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:31.744584 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:31.744171 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:31.744584 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:31.744315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:31.744732 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:31.744706 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:32.743266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:32.743231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:32.743435 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:32.743362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:33.503341 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:33.503301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:33.503749 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:33.503420 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:33.503749 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:33.503486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:41.503470678 +0000 UTC m=+20.350978644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:33.744121 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:33.744086 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:33.744308 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:33.744097 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:33.744308 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:33.744233 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:33.744424 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:33.744309 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:34.744232 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:34.744198 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:34.744626 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:34.744324 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:35.743381 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:35.743346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:35.743531 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:35.743389 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:35.743531 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:35.743477 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:35.743601 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:35.743572 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:36.743538 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:36.743506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:36.743922 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:36.743611 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:37.743887 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:37.743859 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:37.744319 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:37.743856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:37.744319 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:37.743980 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:37.744319 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:37.744037 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:38.339620 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:38.339583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:38.339808 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.339730 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:38.339808 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.339803 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.339781318 +0000 UTC m=+33.187289295 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:38.440208 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:38.440167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:38.440379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.440298 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:38.440379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.440318 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:38.440379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.440331 2577 projected.go:194] Error preparing data for projected volume kube-api-access-xv5jv for pod openshift-network-diagnostics/network-check-target-dg6v9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:38.440498 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.440384 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv podName:9b411dff-3568-43e1-813c-c4ebd140399b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.440370835 +0000 UTC m=+33.287878800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xv5jv" (UniqueName: "kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv") pod "network-check-target-dg6v9" (UID: "9b411dff-3568-43e1-813c-c4ebd140399b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:38.743343 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:38.743279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:38.743471 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:38.743379 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:39.743283 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:39.743246 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:39.743732 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:39.743354 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:39.743732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:39.743417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:39.743732 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:39.743518 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:40.744401 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:40.744171 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:40.744800 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:40.744446 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:40.996897 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:40.996648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" event={"ID":"ab777a15bb250a55fa506fcc23a847b8","Type":"ContainerStarted","Data":"004d6b94a4ebe8ef5f1d7219e36835b1c1bd87f835fe2697dbfb8d7e062984f9"}
Apr 22 14:15:40.997940 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:40.997919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bs9cg" event={"ID":"dbe24998-6780-4517-aeb1-716266573102","Type":"ContainerStarted","Data":"bce4a9c12addd63c8e175a3f8d36408fce7425e3be5a626172191bea6cda4fb6"}
Apr 22 14:15:41.003256 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.003233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" event={"ID":"77146340-5d2a-4222-813e-ac3db16a7bcc","Type":"ContainerStarted","Data":"f554b0d5d10ac2fa0957a150d7726f08238c2a32e0b61094c241a456fefe3a9b"}
Apr 22 14:15:41.012127 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.012002 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-75.ec2.internal" podStartSLOduration=19.011989633 podStartE2EDuration="19.011989633s" podCreationTimestamp="2026-04-22 14:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:41.011929707 +0000 UTC m=+19.859437692" watchObservedRunningTime="2026-04-22 14:15:41.011989633 +0000 UTC m=+19.859497616"
Apr 22 14:15:41.014252 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.014142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"303f804c66b7014834cf22e19f16419ed42b0d63432288fecfb2f22d15d0e21a"}
Apr 22 14:15:41.014252 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.014170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"af87d7880cc7050d8963b7c200ba9286ff8cb4026a8e48a6847ebd943bdd1e38"}
Apr 22 14:15:41.029968 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.029901 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x5ds9" podStartSLOduration=2.862799757 podStartE2EDuration="20.02988554s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.992647921 +0000 UTC m=+1.840155881" lastFinishedPulling="2026-04-22 14:15:40.1597337 +0000 UTC m=+19.007241664" observedRunningTime="2026-04-22 14:15:41.028340197 +0000 UTC m=+19.875848192" watchObservedRunningTime="2026-04-22 14:15:41.02988554 +0000 UTC m=+19.877393524"
Apr 22 14:15:41.564439 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.564363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:41.564561 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:41.564515 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:41.564595 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:41.564571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret podName:75ed979a-756f-4aa8-938c-caef257181c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.564558024 +0000 UTC m=+36.412065989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret") pod "global-pull-secret-syncer-8pz7t" (UID: "75ed979a-756f-4aa8-938c-caef257181c3") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:41.744109 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.744077 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:41.744284 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:41.744206 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:41.744284 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:41.744222 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:41.744364 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:41.744316 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:42.016545 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.016482 2577 generic.go:358] "Generic (PLEG): container finished" podID="3c3613a9a7724d83b7f562a919909906" containerID="962c89a7c4c062ea624103cefd9dc7ed23b6a35d7ddc83e2435090f1752d3ad1" exitCode=0
Apr 22 14:15:42.017131 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.016550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" event={"ID":"3c3613a9a7724d83b7f562a919909906","Type":"ContainerDied","Data":"962c89a7c4c062ea624103cefd9dc7ed23b6a35d7ddc83e2435090f1752d3ad1"}
Apr 22 14:15:42.017895 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.017873 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="7b9d2c1566eb1227ca7090c0a8a0bbb4f8899dfcb5e76438dea68e5e6259bcbf" exitCode=0
Apr 22 14:15:42.017978 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.017953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"7b9d2c1566eb1227ca7090c0a8a0bbb4f8899dfcb5e76438dea68e5e6259bcbf"}
Apr 22 14:15:42.019156 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.019116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5hkxf" event={"ID":"4532406b-2b3c-4280-be31-a1a417b34d6c","Type":"ContainerStarted","Data":"910cc709cd033740a4e790684fc0f26c4b816529aace48385ed704043423124a"}
Apr 22 14:15:42.020379 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.020359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf5kn" event={"ID":"c56bbf69-677d-48ac-9bdd-3f2234c4ebe1","Type":"ContainerStarted","Data":"9d7942df96df5d0d88b9e306406d89577f5cbfd2e171d65a92d345db73ba1865"}
Apr 22 14:15:42.021536 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.021518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zcvnd" event={"ID":"ebee87d9-f911-404c-9e1c-6e244d6b60cd","Type":"ContainerStarted","Data":"c918622c850e1a09a436718b5f769ab421bf30e273f6b17606112c06a474c99f"}
Apr 22 14:15:42.022650 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.022634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" event={"ID":"cdaf48d9-50eb-4523-bd5a-3de107220028","Type":"ContainerStarted","Data":"0d85cfd83800b2a61027e6519762440658658530a32caadc9149e644127ceeb7"}
Apr 22 14:15:42.023751 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.023727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dtg5l" event={"ID":"87f925f7-d447-4a1f-b742-10a72c9ef6a9","Type":"ContainerStarted","Data":"9adc095f445cb7175b4ec8816ed92df86247721534a450a4eccb700366c6bab5"}
Apr 22 14:15:42.026101 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.026045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"723ddd8e9bfdb657f71abf120e7360f60c1485d2c496cfe72498c717ee8f15ed"}
Apr 22 14:15:42.026165 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.026107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"3fdd8f3b140118a5f763dc9dda78511503857c74639b570054ff55c98ca4cd1e"}
Apr 22 14:15:42.026165 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.026121 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"24a75d7ed431682749666296f746ff10506751b9565be960ce9c1aa73319c3c8"}
Apr 22 14:15:42.026165 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.026134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"045609b50595a1f38ba87346c15278e4263eff8fb6641d56256a0709fb302a8a"}
Apr 22 14:15:42.037817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.037360 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bs9cg" podStartSLOduration=3.737782491 podStartE2EDuration="21.037343737s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:23.029648755 +0000 UTC m=+1.877156716" lastFinishedPulling="2026-04-22 14:15:40.329209987 +0000 UTC m=+19.176717962" observedRunningTime="2026-04-22 14:15:41.04619631 +0000 UTC m=+19.893704290" watchObservedRunningTime="2026-04-22 14:15:42.037343737 +0000 UTC m=+20.884851720"
Apr 22 14:15:42.049777 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.049738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dtg5l" podStartSLOduration=3.8571471170000002 podStartE2EDuration="21.049725391s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.965730709 +0000 UTC m=+1.813238670" lastFinishedPulling="2026-04-22 14:15:40.158308968 +0000 UTC m=+19.005816944" observedRunningTime="2026-04-22 14:15:42.049548389 +0000 UTC m=+20.897056372" watchObservedRunningTime="2026-04-22 14:15:42.049725391 +0000 UTC m=+20.897233375"
Apr 22 14:15:42.086015 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.085774 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zcvnd" podStartSLOduration=3.906489928 podStartE2EDuration="21.085756807s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.948048363 +0000 UTC m=+1.795556324" lastFinishedPulling="2026-04-22 14:15:40.127315224 +0000 UTC m=+18.974823203" observedRunningTime="2026-04-22 14:15:42.084812625 +0000 UTC m=+20.932320621" watchObservedRunningTime="2026-04-22 14:15:42.085756807 +0000 UTC m=+20.933264790"
Apr 22 14:15:42.099697 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.099649 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zf5kn" podStartSLOduration=3.911243442 podStartE2EDuration="21.099634699s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.971961946 +0000 UTC m=+1.819469907" lastFinishedPulling="2026-04-22 14:15:40.160353189 +0000 UTC m=+19.007861164" observedRunningTime="2026-04-22 14:15:42.098760172 +0000 UTC m=+20.946268155" watchObservedRunningTime="2026-04-22 14:15:42.099634699 +0000 UTC m=+20.947142683"
Apr 22 14:15:42.203157 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.203136 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 14:15:42.681474 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.681350 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:42.203153545Z","UUID":"e1707bf8-f69d-4973-aae5-3cab7b5cb62c","Handler":null,"Name":"","Endpoint":""}
Apr 22 14:15:42.683466 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.683443 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 14:15:42.683587 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.683477 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 14:15:42.743915 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:42.743886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:42.744046 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:42.743995 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:43.029629 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.029601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" event={"ID":"cdaf48d9-50eb-4523-bd5a-3de107220028","Type":"ContainerStarted","Data":"f0174046d72d320257f3a14309ef643b645b5c31d6040576b414459b86b16fca"}
Apr 22 14:15:43.031377 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.031343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" event={"ID":"3c3613a9a7724d83b7f562a919909906","Type":"ContainerStarted","Data":"f3e1269ab4d57ada424ec6bd908643971653475b9b46a2727a966b07fd35c0f6"}
Apr 22 14:15:43.047343 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.047297 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5hkxf" podStartSLOduration=9.154818065 podStartE2EDuration="22.047280123s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:23.013548676 +0000 UTC m=+1.861056637" lastFinishedPulling="2026-04-22 14:15:35.906010717 +0000 UTC m=+14.753518695" observedRunningTime="2026-04-22 14:15:42.112665106 +0000 UTC m=+20.960173088" watchObservedRunningTime="2026-04-22 14:15:43.047280123 +0000 UTC m=+21.894788107"
Apr 22 14:15:43.048017 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.047989 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-75.ec2.internal" podStartSLOduration=21.047981206 podStartE2EDuration="21.047981206s" podCreationTimestamp="2026-04-22 14:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:43.047158989 +0000 UTC m=+21.894666974" watchObservedRunningTime="2026-04-22 14:15:43.047981206 +0000 UTC m=+21.895489188"
Apr 22 14:15:43.743681 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.743604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:43.743937 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:43.743604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:43.743937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:43.743697 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:43.743937 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:43.743774 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:44.034305 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:44.034219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" event={"ID":"cdaf48d9-50eb-4523-bd5a-3de107220028","Type":"ContainerStarted","Data":"f651b43c988c529ec5e32f35e480a36d4caf344da461fbeacfc324c29324cfbd"}
Apr 22 14:15:44.037011 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:44.036987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"3fd3612d257fff481b4cd1209bda1557de2f4ce0914f9dfdb5349165418f0d81"}
Apr 22 14:15:44.051782 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:44.051746 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n5lbr" podStartSLOduration=3.050276988 podStartE2EDuration="23.051736515s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.986705227 +0000 UTC m=+1.834213192" lastFinishedPulling="2026-04-22 14:15:42.988164752 +0000 UTC m=+21.835672719" observedRunningTime="2026-04-22 14:15:44.051329098 +0000 UTC m=+22.898837075" watchObservedRunningTime="2026-04-22 14:15:44.051736515 +0000 UTC m=+22.899244498"
Apr 22 14:15:44.743938 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:44.743872 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:44.744075 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:44.743972 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:45.743520 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:45.743303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:45.743965 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:45.743303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:45.743965 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:45.743542 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:45.743965 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:45.743594 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:46.043768 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.043684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" event={"ID":"e13fc5ca-d417-47c6-8b6c-63651dc87d31","Type":"ContainerStarted","Data":"84f2e57b81d6fb5cc755b2bc8e75db351e74f4761137d686951716e348013c90"}
Apr 22 14:15:46.043992 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.043972 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:46.058060 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.058038 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:46.075923 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.075878 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" podStartSLOduration=7.204671977 podStartE2EDuration="25.075862105s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:22.934948103 +0000 UTC m=+1.782456064" lastFinishedPulling="2026-04-22 14:15:40.80613822 +0000 UTC m=+19.653646192" observedRunningTime="2026-04-22 14:15:46.074149815 +0000 UTC m=+24.921657799" watchObservedRunningTime="2026-04-22 14:15:46.075862105 +0000 UTC m=+24.923370089"
Apr 22 14:15:46.744160 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.744128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:46.744525 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:46.744259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:46.870094 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.870065 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:46.930449 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.930403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:46.931108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:46.931052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zcvnd"
Apr 22 14:15:47.046263 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.046196 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:47.062070 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.062044 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp"
Apr 22 14:15:47.322977 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.322875 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sfm8m"]
Apr 22 14:15:47.323129 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.323023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:47.323213 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:47.323158 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:47.325753 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.325722 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8pz7t"]
Apr 22 14:15:47.325870 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.325842 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:47.325958 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:47.325932 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:47.326541 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.326516 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dg6v9"]
Apr 22 14:15:47.326646 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:47.326604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:47.326712 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:47.326681 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b"
Apr 22 14:15:48.744094 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:48.744024 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:15:48.744094 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:48.744068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t"
Apr 22 14:15:48.744550 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:48.744142 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3"
Apr 22 14:15:48.744550 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:48.744208 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8"
Apr 22 14:15:49.050595 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:49.050518 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="ef94e7e58cc75e37812ec801219de71e11aac4e1f3a9cef69f20e4f2c8daf625" exitCode=0
Apr 22 14:15:49.050595 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:49.050580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"ef94e7e58cc75e37812ec801219de71e11aac4e1f3a9cef69f20e4f2c8daf625"}
Apr 22 14:15:49.743842 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:49.743812 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9"
Apr 22 14:15:49.743975 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:49.743940 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:50.054096 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:50.054023 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="c0b00cfba0836b0b47b4b2fe618bde9d98f09b56bae87fc0d24020a1e3440c70" exitCode=0 Apr 22 14:15:50.054096 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:50.054079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"c0b00cfba0836b0b47b4b2fe618bde9d98f09b56bae87fc0d24020a1e3440c70"} Apr 22 14:15:50.743823 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:50.743795 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:50.743952 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:50.743805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:50.743952 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:50.743934 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:50.744066 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:50.744033 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:51.058011 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.057978 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="acef03004d470805eb0d7026ecd2b5863ce374b9a5df24bd86a3bcccfb312e0c" exitCode=0 Apr 22 14:15:51.058333 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.058040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"acef03004d470805eb0d7026ecd2b5863ce374b9a5df24bd86a3bcccfb312e0c"} Apr 22 14:15:51.479796 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.479765 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zcvnd" Apr 22 14:15:51.479978 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.479929 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 14:15:51.480483 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.480462 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zcvnd" Apr 22 14:15:51.745369 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:51.745290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:51.745528 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:51.745397 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dg6v9" podUID="9b411dff-3568-43e1-813c-c4ebd140399b" Apr 22 14:15:52.743258 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:52.743229 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:52.743653 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:52.743239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:52.743653 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:52.743349 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8pz7t" podUID="75ed979a-756f-4aa8-938c-caef257181c3" Apr 22 14:15:52.743653 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:52.743409 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:15:53.450126 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.449957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-75.ec2.internal" event="NodeReady" Apr 22 14:15:53.450289 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.450247 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:15:53.496439 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.496411 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zt24h"] Apr 22 14:15:53.499958 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.499940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.502509 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.502487 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpr7"] Apr 22 14:15:53.504288 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.504268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:53.504395 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.504319 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:53.504395 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.504342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\"" Apr 22 14:15:53.505898 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.505880 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:53.508925 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.508696 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\"" Apr 22 14:15:53.508925 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.508746 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:53.509064 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.508930 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:53.509064 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.508941 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:53.512467 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.512451 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zt24h"] Apr 22 14:15:53.515623 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.515600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpr7"] Apr 22 14:15:53.669872 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.669839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.670054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.669897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-config-volume\") pod 
\"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.670054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.669919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmvg\" (UniqueName: \"kubernetes.io/projected/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-kube-api-access-vtmvg\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.670054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.669959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:53.670054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.669975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jp9c\" (UniqueName: \"kubernetes.io/projected/428e51f0-2daf-428e-8b5a-df5ee4eab661-kube-api-access-5jp9c\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:53.670232 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.670072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-tmp-dir\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.744083 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.743991 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:53.748408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.747173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:53.748408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.747362 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-77xtv\"" Apr 22 14:15:53.748408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.747484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:53.770715 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.770696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-config-volume\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.770824 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.770723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmvg\" (UniqueName: \"kubernetes.io/projected/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-kube-api-access-vtmvg\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.770824 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.770740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:53.770824 ip-10-0-131-75 kubenswrapper[2577]: I0422 
14:15:53.770755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jp9c\" (UniqueName: \"kubernetes.io/projected/428e51f0-2daf-428e-8b5a-df5ee4eab661-kube-api-access-5jp9c\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:53.770824 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.770797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-tmp-dir\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.770997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.770834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.770997 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:53.770852 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:53.770997 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:53.770909 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.270890056 +0000 UTC m=+33.118398034 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:15:53.770997 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:53.770932 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:53.770997 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:53.770975 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.270960191 +0000 UTC m=+33.118468177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:53.771252 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.771149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-tmp-dir\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.771288 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.771266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-config-volume\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.783433 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.783410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vtmvg\" (UniqueName: \"kubernetes.io/projected/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-kube-api-access-vtmvg\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:53.783584 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:53.783567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jp9c\" (UniqueName: \"kubernetes.io/projected/428e51f0-2daf-428e-8b5a-df5ee4eab661-kube-api-access-5jp9c\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:54.274084 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.274044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:54.274269 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.274216 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:54.274269 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.274242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:54.274395 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.274297 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:55.274277906 +0000 UTC m=+34.121785883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:15:54.274395 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.274351 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:54.274395 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.274394 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.274383721 +0000 UTC m=+34.121891682 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:54.375459 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.375422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:54.375614 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.375586 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:54.375727 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:54.375662 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:26.37564362 +0000 UTC m=+65.223151599 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:54.476515 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.476481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:54.479303 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.479277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5jv\" (UniqueName: \"kubernetes.io/projected/9b411dff-3568-43e1-813c-c4ebd140399b-kube-api-access-xv5jv\") pod \"network-check-target-dg6v9\" (UID: \"9b411dff-3568-43e1-813c-c4ebd140399b\") " pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:54.656486 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.656449 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:15:54.744104 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.744071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:54.744570 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.744075 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:15:54.747070 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.747034 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:54.747203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.747124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\"" Apr 22 14:15:54.747203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:54.747161 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:15:55.281886 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:55.281852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:55.282076 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:55.281931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:55.282076 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:55.282035 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:55.282206 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:55.282114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:57.282100022 +0000 UTC m=+36.129607983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:55.282206 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:55.282037 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:55.282315 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:55.282209 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.282192435 +0000 UTC m=+36.129700414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:15:56.570812 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:56.570788 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dg6v9"] Apr 22 14:15:56.574823 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:56.574800 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b411dff_3568_43e1_813c_c4ebd140399b.slice/crio-ff202df0428d184e6f27d33e81f56d99a5fd93e0adae75537a65634129d7e9cf WatchSource:0}: Error finding container ff202df0428d184e6f27d33e81f56d99a5fd93e0adae75537a65634129d7e9cf: Status 404 returned error can't find the container with id ff202df0428d184e6f27d33e81f56d99a5fd93e0adae75537a65634129d7e9cf Apr 22 14:15:57.071712 ip-10-0-131-75 
kubenswrapper[2577]: I0422 14:15:57.071400 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="7bc56febe5d3faf66195de545508b16f1e15e664eff0f6f4c36d124c453288d9" exitCode=0 Apr 22 14:15:57.071712 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.071464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"7bc56febe5d3faf66195de545508b16f1e15e664eff0f6f4c36d124c453288d9"} Apr 22 14:15:57.072925 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.072902 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dg6v9" event={"ID":"9b411dff-3568-43e1-813c-c4ebd140399b","Type":"ContainerStarted","Data":"ff202df0428d184e6f27d33e81f56d99a5fd93e0adae75537a65634129d7e9cf"} Apr 22 14:15:57.296548 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.296514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:15:57.296687 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.296605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:15:57.296742 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:57.296704 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:57.296796 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:57.296754 2577 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:57.296796 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:57.296765 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.296747396 +0000 UTC m=+40.144255370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:15:57.296796 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:15:57.296791 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.296781254 +0000 UTC m=+40.144289225 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:57.598927 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.598896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:57.602322 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.602299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75ed979a-756f-4aa8-938c-caef257181c3-original-pull-secret\") pod \"global-pull-secret-syncer-8pz7t\" (UID: \"75ed979a-756f-4aa8-938c-caef257181c3\") " pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:57.755000 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.754968 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8pz7t" Apr 22 14:15:57.878318 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:57.878249 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8pz7t"] Apr 22 14:15:57.881743 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:15:57.881716 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75ed979a_756f_4aa8_938c_caef257181c3.slice/crio-7ecb91ec8cf59ff53206edc1bde230d09e97aebbec6d7584d04f88cec12ce6f5 WatchSource:0}: Error finding container 7ecb91ec8cf59ff53206edc1bde230d09e97aebbec6d7584d04f88cec12ce6f5: Status 404 returned error can't find the container with id 7ecb91ec8cf59ff53206edc1bde230d09e97aebbec6d7584d04f88cec12ce6f5 Apr 22 14:15:58.076779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:58.076741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8pz7t" event={"ID":"75ed979a-756f-4aa8-938c-caef257181c3","Type":"ContainerStarted","Data":"7ecb91ec8cf59ff53206edc1bde230d09e97aebbec6d7584d04f88cec12ce6f5"} Apr 22 14:15:58.079771 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:58.079744 2577 generic.go:358] "Generic (PLEG): container finished" podID="de6e4092-f486-48f9-b9c5-7b146b3d9c83" containerID="6cffed1f67c9832258e4e55d28d518dbafd1f12ee0397ff3c0a120849aca378d" exitCode=0 Apr 22 14:15:58.079893 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:58.079790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerDied","Data":"6cffed1f67c9832258e4e55d28d518dbafd1f12ee0397ff3c0a120849aca378d"} Apr 22 14:15:59.085975 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:59.085944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" 
event={"ID":"de6e4092-f486-48f9-b9c5-7b146b3d9c83","Type":"ContainerStarted","Data":"8e940b0d4a16bee6d4492949407ec2008573a2e1e92e7ba6829b6d8e58d33859"} Apr 22 14:15:59.120796 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:15:59.120753 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pq7wr" podStartSLOduration=4.578995847 podStartE2EDuration="38.120734495s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:23.025128889 +0000 UTC m=+1.872636850" lastFinishedPulling="2026-04-22 14:15:56.566867535 +0000 UTC m=+35.414375498" observedRunningTime="2026-04-22 14:15:59.11838636 +0000 UTC m=+37.965894383" watchObservedRunningTime="2026-04-22 14:15:59.120734495 +0000 UTC m=+37.968242479" Apr 22 14:16:00.088997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:00.088960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dg6v9" event={"ID":"9b411dff-3568-43e1-813c-c4ebd140399b","Type":"ContainerStarted","Data":"f14508393eb4643abead67a0e8a7f971b93d9191a368771982aae94bf886a1ac"} Apr 22 14:16:00.089396 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:00.089198 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:16:00.111940 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:00.111867 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dg6v9" podStartSLOduration=35.804030303 podStartE2EDuration="39.111847395s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.576811582 +0000 UTC m=+35.424319547" lastFinishedPulling="2026-04-22 14:15:59.884628662 +0000 UTC m=+38.732136639" observedRunningTime="2026-04-22 14:16:00.108881056 +0000 UTC m=+38.956389039" watchObservedRunningTime="2026-04-22 
14:16:00.111847395 +0000 UTC m=+38.959355382" Apr 22 14:16:01.326952 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:01.326914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:16:01.327379 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:01.326981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:16:01.327379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:01.327036 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:01.327379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:01.327069 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:01.327379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:01.327134 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:09.327110738 +0000 UTC m=+48.174618720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:16:01.327379 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:01.327151 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:09.327144966 +0000 UTC m=+48.174652927 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:16:03.095333 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:03.095296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8pz7t" event={"ID":"75ed979a-756f-4aa8-938c-caef257181c3","Type":"ContainerStarted","Data":"f6e2109ecebd7bcb215aa890c78c9e44398544e7d02774427e1afa4970cb8db2"} Apr 22 14:16:09.377505 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:09.377468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:16:09.377947 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:09.377539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 
14:16:09.377947 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:09.377621 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:09.377947 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:09.377687 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.377670104 +0000 UTC m=+64.225178071 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:16:09.377947 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:09.377633 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:09.377947 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:09.377766 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.377749956 +0000 UTC m=+64.225257923 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:16:19.068223 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:19.068193 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6zgvp" Apr 22 14:16:19.101957 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:19.101914 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8pz7t" podStartSLOduration=49.89774588 podStartE2EDuration="54.101899765s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:57.883472079 +0000 UTC m=+36.730980040" lastFinishedPulling="2026-04-22 14:16:02.087625951 +0000 UTC m=+40.935133925" observedRunningTime="2026-04-22 14:16:03.11849676 +0000 UTC m=+41.966004741" watchObservedRunningTime="2026-04-22 14:16:19.101899765 +0000 UTC m=+57.949407763" Apr 22 14:16:25.382881 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:25.382824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:16:25.382881 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:25.382889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:16:25.383326 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:25.382976 2577 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:25.383326 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:25.382989 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:25.383326 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:25.383037 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:57.383022813 +0000 UTC m=+96.230530774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:16:25.383326 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:25.383050 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:57.38304489 +0000 UTC m=+96.230552851 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:16:26.389268 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:26.389228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:16:26.392115 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:26.392097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:26.399652 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:26.399632 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:26.399729 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:26.399687 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:30.399671736 +0000 UTC m=+129.247179697 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : secret "metrics-daemon-secret" not found Apr 22 14:16:31.094108 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:31.094076 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dg6v9" Apr 22 14:16:57.400355 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:57.400318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:16:57.400771 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:16:57.400376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:16:57.400771 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:57.400457 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:57.400771 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:57.400472 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:57.400771 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:57.400517 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert podName:428e51f0-2daf-428e-8b5a-df5ee4eab661 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:18:01.400501198 +0000 UTC m=+160.248009159 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert") pod "ingress-canary-jzpr7" (UID: "428e51f0-2daf-428e-8b5a-df5ee4eab661") : secret "canary-serving-cert" not found Apr 22 14:16:57.400771 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:16:57.400530 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls podName:a5aaec7d-088d-41df-9b1e-e0eb09629b1e nodeName:}" failed. No retries permitted until 2026-04-22 14:18:01.400524758 +0000 UTC m=+160.248032719 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls") pod "dns-default-zt24h" (UID: "a5aaec7d-088d-41df-9b1e-e0eb09629b1e") : secret "dns-default-metrics-tls" not found Apr 22 14:17:30.422758 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:30.422721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:17:30.423264 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:30.422873 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:30.423264 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:30.422943 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs podName:5e382d5b-073e-4cd5-adc4-f9741cc073d8 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:19:32.422928321 +0000 UTC m=+251.270436287 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs") pod "network-metrics-daemon-sfm8m" (UID: "5e382d5b-073e-4cd5-adc4-f9741cc073d8") : secret "metrics-daemon-secret" not found Apr 22 14:17:41.120752 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.120721 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cl44f"] Apr 22 14:17:41.123399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.123384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.127076 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.127053 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 14:17:41.127214 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.127117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 14:17:41.127288 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.127207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pztgc\"" Apr 22 14:17:41.128367 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.128320 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.128367 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.128329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.132559 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.132543 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 14:17:41.137946 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.137927 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cl44f"] Apr 22 14:17:41.190861 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.190832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.191004 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.190866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-serving-cert\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.191004 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.190890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kk8n\" (UniqueName: \"kubernetes.io/projected/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-kube-api-access-8kk8n\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.191004 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.190991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-tmp\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " 
pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.191099 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.191017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-service-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.191099 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.191061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-snapshots\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f" Apr 22 14:17:41.229367 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.229331 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"] Apr 22 14:17:41.232045 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.232029 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-88d969974-5t68d"] Apr 22 14:17:41.232203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.232170 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" Apr 22 14:17:41.235224 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.235206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"] Apr 22 14:17:41.235355 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.235340 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:41.236149 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.236128 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.236639 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.236623 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 14:17:41.236721 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.236624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 14:17:41.236721 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.236676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-74896\"" Apr 22 14:17:41.236956 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.236941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.237883 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.237863 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" Apr 22 14:17:41.240093 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240078 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.240348 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240328 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 14:17:41.240434 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240328 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 14:17:41.240434 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240330 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 14:17:41.240641 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240627 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.240704 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240642 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 14:17:41.240704 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240627 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jcb4j\"" Apr 22 14:17:41.240942 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.240923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 14:17:41.241035 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.241001 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-9fzgf\"" Apr 22 14:17:41.241035 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.241007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.241155 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.241127 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.241970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.241952 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 14:17:41.250599 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.250571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"] Apr 22 14:17:41.251843 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.251817 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"] Apr 22 14:17:41.254736 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.254714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-88d969974-5t68d"] Apr 22 14:17:41.291988 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.291964 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-default-certificate\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:41.292139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.291994 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b18091-6470-4c4d-9813-07ff092aaa8b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.292139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-serving-cert\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kk8n\" (UniqueName: \"kubernetes.io/projected/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-kube-api-access-8kk8n\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292139 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnsqj\" (UniqueName: \"kubernetes.io/projected/383f416f-aae2-4ddf-82c2-ed791b0c8a02-kube-api-access-nnsqj\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-tmp\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-service-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-stats-auth\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.292398 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/acca7920-73c1-4c87-b10d-8087b0ef338e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-snapshots\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b18091-6470-4c4d-9813-07ff092aaa8b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8l52\" (UniqueName: \"kubernetes.io/projected/c9b18091-6470-4c4d-9813-07ff092aaa8b-kube-api-access-l8l52\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhz7\" (UniqueName: \"kubernetes.io/projected/acca7920-73c1-4c87-b10d-8087b0ef338e-kube-api-access-nrhz7\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.292655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-tmp\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292872 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-service-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.292909 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.292894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-snapshots\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.293035 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.293018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.295118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.295097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-serving-cert\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.299745 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.299724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kk8n\" (UniqueName: \"kubernetes.io/projected/63f7e5c1-d4ff-4d3c-ba1e-425c1585a851-kube-api-access-8kk8n\") pod \"insights-operator-585dfdc468-cl44f\" (UID: \"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851\") " pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.393050 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.392959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.393050 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.392999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.393050 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-stats-auth\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.393050 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/acca7920-73c1-4c87-b10d-8087b0ef338e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.393132 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.393134 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.393165 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.893144746 +0000 UTC m=+140.740652729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b18091-6470-4c4d-9813-07ff092aaa8b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.393223 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.893205186 +0000 UTC m=+140.740713163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : secret "router-metrics-certs-default" not found
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.393242 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.893233069 +0000 UTC m=+140.740741044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8l52\" (UniqueName: \"kubernetes.io/projected/c9b18091-6470-4c4d-9813-07ff092aaa8b-kube-api-access-l8l52\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhz7\" (UniqueName: \"kubernetes.io/projected/acca7920-73c1-4c87-b10d-8087b0ef338e-kube-api-access-nrhz7\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-default-certificate\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.393426 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b18091-6470-4c4d-9813-07ff092aaa8b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.393931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnsqj\" (UniqueName: \"kubernetes.io/projected/383f416f-aae2-4ddf-82c2-ed791b0c8a02-kube-api-access-nnsqj\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.393931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b18091-6470-4c4d-9813-07ff092aaa8b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.393931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.393822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/acca7920-73c1-4c87-b10d-8087b0ef338e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.395500 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.395481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b18091-6470-4c4d-9813-07ff092aaa8b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.396300 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.396274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-stats-auth\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.396391 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.396300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-default-certificate\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.404976 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.404958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8l52\" (UniqueName: \"kubernetes.io/projected/c9b18091-6470-4c4d-9813-07ff092aaa8b-kube-api-access-l8l52\") pod \"kube-storage-version-migrator-operator-6769c5d45-sw58b\" (UID: \"c9b18091-6470-4c4d-9813-07ff092aaa8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.408133 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.408112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnsqj\" (UniqueName: \"kubernetes.io/projected/383f416f-aae2-4ddf-82c2-ed791b0c8a02-kube-api-access-nnsqj\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.408795 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.408777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhz7\" (UniqueName: \"kubernetes.io/projected/acca7920-73c1-4c87-b10d-8087b0ef338e-kube-api-access-nrhz7\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.431482 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.431450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-cl44f"
Apr 22 14:17:41.543503 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.543472 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cl44f"]
Apr 22 14:17:41.546599 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:17:41.546575 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f7e5c1_d4ff_4d3c_ba1e_425c1585a851.slice/crio-63c3700795bc093a694876d0268d40d7d20f6021a5b1f97370cacfec368eefc1 WatchSource:0}: Error finding container 63c3700795bc093a694876d0268d40d7d20f6021a5b1f97370cacfec368eefc1: Status 404 returned error can't find the container with id 63c3700795bc093a694876d0268d40d7d20f6021a5b1f97370cacfec368eefc1
Apr 22 14:17:41.552658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.552641 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"
Apr 22 14:17:41.659838 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.659631 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b"]
Apr 22 14:17:41.662029 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:17:41.662008 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b18091_6470_4c4d_9813_07ff092aaa8b.slice/crio-a8bd26f8b5e4bb06f1e731055e92f4f6e2b8bdc17d74a81d47f1e8513e5be47b WatchSource:0}: Error finding container a8bd26f8b5e4bb06f1e731055e92f4f6e2b8bdc17d74a81d47f1e8513e5be47b: Status 404 returned error can't find the container with id a8bd26f8b5e4bb06f1e731055e92f4f6e2b8bdc17d74a81d47f1e8513e5be47b
Apr 22 14:17:41.896836 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.896800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.897009 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.896853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:41.897009 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:41.896913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:41.897009 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.896974 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:41.897009 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.896997 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:42.896973071 +0000 UTC m=+141.744481045 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:41.897245 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.897023 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:41.897245 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.897040 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:42.89702598 +0000 UTC m=+141.744533946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : secret "router-metrics-certs-default" not found
Apr 22 14:17:41.897245 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:41.897065 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:42.897052716 +0000 UTC m=+141.744560691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:42.276395 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:42.276342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cl44f" event={"ID":"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851","Type":"ContainerStarted","Data":"63c3700795bc093a694876d0268d40d7d20f6021a5b1f97370cacfec368eefc1"}
Apr 22 14:17:42.278084 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:42.278057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" event={"ID":"c9b18091-6470-4c4d-9813-07ff092aaa8b","Type":"ContainerStarted","Data":"a8bd26f8b5e4bb06f1e731055e92f4f6e2b8bdc17d74a81d47f1e8513e5be47b"}
Apr 22 14:17:42.906949 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:42.906915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:42.906949 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:42.906956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:42.907166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:42.907028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:42.907166 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:42.907068 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:42.907166 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:42.907127 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:44.907107876 +0000 UTC m=+143.754615861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : secret "router-metrics-certs-default" not found
Apr 22 14:17:42.907291 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:42.907192 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:42.907291 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:42.907218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:44.907203025 +0000 UTC m=+143.754711006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:42.907291 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:42.907238 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:44.907229485 +0000 UTC m=+143.754737450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:44.284100 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.284060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cl44f" event={"ID":"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851","Type":"ContainerStarted","Data":"16249b3c030daa12aecc844a023c861188e751ba54b00da8a5275db15aaa3f60"}
Apr 22 14:17:44.285290 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.285266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" event={"ID":"c9b18091-6470-4c4d-9813-07ff092aaa8b","Type":"ContainerStarted","Data":"ea863b9c62fac62ce19b2e1bdf41d3808e0e5e67159c5f4e28221a2836c254c8"}
Apr 22 14:17:44.306684 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.306620 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-cl44f" podStartSLOduration=1.324282181 podStartE2EDuration="3.306604143s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.548257292 +0000 UTC m=+140.395765253" lastFinishedPulling="2026-04-22 14:17:43.530579246 +0000 UTC m=+142.378087215" observedRunningTime="2026-04-22 14:17:44.305700087 +0000 UTC m=+143.153208070" watchObservedRunningTime="2026-04-22 14:17:44.306604143 +0000 UTC m=+143.154112128"
Apr 22 14:17:44.326878 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.326834 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" podStartSLOduration=1.457295285 podStartE2EDuration="3.326822515s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.663833544 +0000 UTC m=+140.511341505" lastFinishedPulling="2026-04-22 14:17:43.533360773 +0000 UTC m=+142.380868735" observedRunningTime="2026-04-22 14:17:44.326265754 +0000 UTC m=+143.173773750" watchObservedRunningTime="2026-04-22 14:17:44.326822515 +0000 UTC m=+143.174330497"
Apr 22 14:17:44.925819 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.925775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:44.925819 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.925823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d"
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:44.925867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:44.925936 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:48.925917864 +0000 UTC m=+147.773425838 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:44.925975 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:44.925979 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:44.926029 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:48.926014979 +0000 UTC m=+147.773522941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:44.926081 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:44.926046 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:48.926037011 +0000 UTC m=+147.773544972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : secret "router-metrics-certs-default" not found
Apr 22 14:17:45.265112 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.265029 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"]
Apr 22 14:17:45.268173 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.268153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"
Apr 22 14:17:45.281814 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.281792 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 14:17:45.282509 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.282486 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 14:17:45.282619 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.282510 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-drfzv\""
Apr 22 14:17:45.289372 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.289350 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"]
Apr 22 14:17:45.328852 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.328818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6n4k\" (UniqueName: \"kubernetes.io/projected/4ff583ff-248a-4c95-b4a5-3026c7d2f5cc-kube-api-access-s6n4k\") pod \"migrator-74bb7799d9-zbv5q\" (UID: \"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"
Apr 22 14:17:45.429662 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.429633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6n4k\" (UniqueName: \"kubernetes.io/projected/4ff583ff-248a-4c95-b4a5-3026c7d2f5cc-kube-api-access-s6n4k\") pod \"migrator-74bb7799d9-zbv5q\" (UID: \"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"
Apr 22 14:17:45.440836 ip-10-0-131-75 kubenswrapper[2577]:
I0422 14:17:45.440813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6n4k\" (UniqueName: \"kubernetes.io/projected/4ff583ff-248a-4c95-b4a5-3026c7d2f5cc-kube-api-access-s6n4k\") pod \"migrator-74bb7799d9-zbv5q\" (UID: \"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" Apr 22 14:17:45.576614 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.576519 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" Apr 22 14:17:45.708769 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:45.708729 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q"] Apr 22 14:17:45.712280 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:17:45.712257 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff583ff_248a_4c95_b4a5_3026c7d2f5cc.slice/crio-2aefedc1428c9e0eb3bb2af7dd390a07fd68a40585f134887e89edc5ff6e41ae WatchSource:0}: Error finding container 2aefedc1428c9e0eb3bb2af7dd390a07fd68a40585f134887e89edc5ff6e41ae: Status 404 returned error can't find the container with id 2aefedc1428c9e0eb3bb2af7dd390a07fd68a40585f134887e89edc5ff6e41ae Apr 22 14:17:46.289390 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:46.289341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" event={"ID":"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc","Type":"ContainerStarted","Data":"2aefedc1428c9e0eb3bb2af7dd390a07fd68a40585f134887e89edc5ff6e41ae"} Apr 22 14:17:46.763579 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:46.763559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dtg5l_87f925f7-d447-4a1f-b742-10a72c9ef6a9/dns-node-resolver/0.log" Apr 22 14:17:47.293571 
ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.293536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" event={"ID":"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc","Type":"ContainerStarted","Data":"1e03f2a3865a4cbbce398aebb7c5d8507024ca37be723962c4a5e81a7bf1a74c"} Apr 22 14:17:47.293571 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.293570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" event={"ID":"4ff583ff-248a-4c95-b4a5-3026c7d2f5cc","Type":"ContainerStarted","Data":"21397575b251ac7b0094af2c79c79d634b855b1acf141916869d4b3f9a06306d"} Apr 22 14:17:47.368072 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.368019 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zbv5q" podStartSLOduration=1.370319527 podStartE2EDuration="2.367996819s" podCreationTimestamp="2026-04-22 14:17:45 +0000 UTC" firstStartedPulling="2026-04-22 14:17:45.716315278 +0000 UTC m=+144.563823242" lastFinishedPulling="2026-04-22 14:17:46.713992571 +0000 UTC m=+145.561500534" observedRunningTime="2026-04-22 14:17:47.367723016 +0000 UTC m=+146.215230998" watchObservedRunningTime="2026-04-22 14:17:47.367996819 +0000 UTC m=+146.215504801" Apr 22 14:17:47.382029 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.382005 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5hkxf_4532406b-2b3c-4280-be31-a1a417b34d6c/node-ca/0.log" Apr 22 14:17:47.387696 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.387658 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-p4rtj"] Apr 22 14:17:47.391030 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.391011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.394485 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.394470 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 14:17:47.394761 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.394744 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 14:17:47.394819 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.394744 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mjpfc\"" Apr 22 14:17:47.394965 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.394952 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 14:17:47.395564 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.395552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 14:17:47.413212 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.413171 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-p4rtj"] Apr 22 14:17:47.444550 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.444523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-cabundle\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.444666 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.444575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwz2\" (UniqueName: 
\"kubernetes.io/projected/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-kube-api-access-kzwz2\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.444732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.444663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-key\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.545865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.545792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-key\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.545865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.545842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-cabundle\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.546037 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.545881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwz2\" (UniqueName: \"kubernetes.io/projected/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-kube-api-access-kzwz2\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.546514 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.546497 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-cabundle\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.548027 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.548008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-signing-key\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.563196 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.563165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwz2\" (UniqueName: \"kubernetes.io/projected/4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b-kube-api-access-kzwz2\") pod \"service-ca-865cb79987-p4rtj\" (UID: \"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b\") " pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.699062 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.699017 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-p4rtj" Apr 22 14:17:47.810767 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:47.810683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-p4rtj"] Apr 22 14:17:47.812848 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:17:47.812822 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3999a9_a639_47b4_b7ad_f4e6a9fdf38b.slice/crio-12ac6c34a95f264af5fe9b23879fd69908436d0fd33fa85e0ec6649586259ad8 WatchSource:0}: Error finding container 12ac6c34a95f264af5fe9b23879fd69908436d0fd33fa85e0ec6649586259ad8: Status 404 returned error can't find the container with id 12ac6c34a95f264af5fe9b23879fd69908436d0fd33fa85e0ec6649586259ad8 Apr 22 14:17:48.297595 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:48.297561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-p4rtj" event={"ID":"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b","Type":"ContainerStarted","Data":"12ac6c34a95f264af5fe9b23879fd69908436d0fd33fa85e0ec6649586259ad8"} Apr 22 14:17:48.958411 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:48.958369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:48.958451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: 
\"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:48.958524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:48.958539 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:48.958614 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:56.958593808 +0000 UTC m=+155.806101782 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:48.958633 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:56.958624058 +0000 UTC m=+155.806132026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:48.958658 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:48.958645 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:48.958954 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:48.958686 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs podName:383f416f-aae2-4ddf-82c2-ed791b0c8a02 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:56.958672084 +0000 UTC m=+155.806180045 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs") pod "router-default-88d969974-5t68d" (UID: "383f416f-aae2-4ddf-82c2-ed791b0c8a02") : secret "router-metrics-certs-default" not found Apr 22 14:17:50.304390 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:50.304352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-p4rtj" event={"ID":"4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b","Type":"ContainerStarted","Data":"6339976fe2601fa92c7df8a9b3c3d7a1990c23195143db14d73ea8d3f0659707"} Apr 22 14:17:50.325923 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:50.325875 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-p4rtj" podStartSLOduration=1.846464395 podStartE2EDuration="3.325856149s" podCreationTimestamp="2026-04-22 14:17:47 +0000 UTC" firstStartedPulling="2026-04-22 14:17:47.814572004 +0000 UTC m=+146.662079968" lastFinishedPulling="2026-04-22 14:17:49.293963745 
+0000 UTC m=+148.141471722" observedRunningTime="2026-04-22 14:17:50.324034786 +0000 UTC m=+149.171542768" watchObservedRunningTime="2026-04-22 14:17:50.325856149 +0000 UTC m=+149.173364132" Apr 22 14:17:56.512630 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:56.512585 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zt24h" podUID="a5aaec7d-088d-41df-9b1e-e0eb09629b1e" Apr 22 14:17:56.518708 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:56.518679 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jzpr7" podUID="428e51f0-2daf-428e-8b5a-df5ee4eab661" Apr 22 14:17:57.022591 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.022550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:57.022775 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.022610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" Apr 22 14:17:57.022775 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.022668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:57.022856 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:57.022765 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:57.022856 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:57.022836 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls podName:acca7920-73c1-4c87-b10d-8087b0ef338e nodeName:}" failed. No retries permitted until 2026-04-22 14:18:13.022821044 +0000 UTC m=+171.870329010 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-87cbq" (UID: "acca7920-73c1-4c87-b10d-8087b0ef338e") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:57.023301 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.023274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/383f416f-aae2-4ddf-82c2-ed791b0c8a02-service-ca-bundle\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:57.024915 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.024898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383f416f-aae2-4ddf-82c2-ed791b0c8a02-metrics-certs\") pod \"router-default-88d969974-5t68d\" (UID: \"383f416f-aae2-4ddf-82c2-ed791b0c8a02\") " 
pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:57.147438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.147357 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:57.270365 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.270335 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-88d969974-5t68d"] Apr 22 14:17:57.273515 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:17:57.273489 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383f416f_aae2_4ddf_82c2_ed791b0c8a02.slice/crio-829ff7c0eddcaaf576261597221d29daf5cecec97f4eeef65a20933e117852d1 WatchSource:0}: Error finding container 829ff7c0eddcaaf576261597221d29daf5cecec97f4eeef65a20933e117852d1: Status 404 returned error can't find the container with id 829ff7c0eddcaaf576261597221d29daf5cecec97f4eeef65a20933e117852d1 Apr 22 14:17:57.322658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.322614 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zt24h" Apr 22 14:17:57.322658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.322616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-88d969974-5t68d" event={"ID":"383f416f-aae2-4ddf-82c2-ed791b0c8a02","Type":"ContainerStarted","Data":"829ff7c0eddcaaf576261597221d29daf5cecec97f4eeef65a20933e117852d1"} Apr 22 14:17:57.322658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:57.322647 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:17:57.760549 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:17:57.760517 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-sfm8m" podUID="5e382d5b-073e-4cd5-adc4-f9741cc073d8" Apr 22 14:17:58.326402 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:58.326370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-88d969974-5t68d" event={"ID":"383f416f-aae2-4ddf-82c2-ed791b0c8a02","Type":"ContainerStarted","Data":"281364d4940d0c20c61f47e09390954def9248494abb4479580e6202b76b5308"} Apr 22 14:17:58.348230 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:58.348170 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-88d969974-5t68d" podStartSLOduration=17.348155771 podStartE2EDuration="17.348155771s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:58.347825698 +0000 UTC m=+157.195333680" watchObservedRunningTime="2026-04-22 14:17:58.348155771 +0000 UTC m=+157.195663754" Apr 22 14:17:59.148173 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:59.148135 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:59.150629 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:59.150608 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:59.329436 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:59.329402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:17:59.330510 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:17:59.330488 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-88d969974-5t68d" Apr 22 14:18:01.457055 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.457021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:18:01.457515 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.457077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:18:01.459420 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.459398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5aaec7d-088d-41df-9b1e-e0eb09629b1e-metrics-tls\") pod \"dns-default-zt24h\" (UID: \"a5aaec7d-088d-41df-9b1e-e0eb09629b1e\") " pod="openshift-dns/dns-default-zt24h" Apr 22 14:18:01.459537 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.459498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428e51f0-2daf-428e-8b5a-df5ee4eab661-cert\") pod \"ingress-canary-jzpr7\" (UID: \"428e51f0-2daf-428e-8b5a-df5ee4eab661\") " pod="openshift-ingress-canary/ingress-canary-jzpr7" Apr 22 14:18:01.526343 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.526311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\"" Apr 22 14:18:01.527360 
ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.527344 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\""
Apr 22 14:18:01.534704 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.534686 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zt24h"
Apr 22 14:18:01.534778 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.534710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpr7"
Apr 22 14:18:01.653950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.653877 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpr7"]
Apr 22 14:18:01.656424 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:01.656399 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428e51f0_2daf_428e_8b5a_df5ee4eab661.slice/crio-021527c20274048f17bac811c3c174b4e1ef31f5650141e7acd540b7ac17cfa2 WatchSource:0}: Error finding container 021527c20274048f17bac811c3c174b4e1ef31f5650141e7acd540b7ac17cfa2: Status 404 returned error can't find the container with id 021527c20274048f17bac811c3c174b4e1ef31f5650141e7acd540b7ac17cfa2
Apr 22 14:18:01.671654 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:01.671634 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zt24h"]
Apr 22 14:18:01.674405 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:01.674383 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5aaec7d_088d_41df_9b1e_e0eb09629b1e.slice/crio-9d29483cad3970f15c68f4dd3b18acfea48196eb6c844ab700a39ef065de364b WatchSource:0}: Error finding container 9d29483cad3970f15c68f4dd3b18acfea48196eb6c844ab700a39ef065de364b: Status 404 returned error can't find the container with id 9d29483cad3970f15c68f4dd3b18acfea48196eb6c844ab700a39ef065de364b
Apr 22 14:18:02.338229 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:02.338187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jzpr7" event={"ID":"428e51f0-2daf-428e-8b5a-df5ee4eab661","Type":"ContainerStarted","Data":"021527c20274048f17bac811c3c174b4e1ef31f5650141e7acd540b7ac17cfa2"}
Apr 22 14:18:02.339377 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:02.339339 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zt24h" event={"ID":"a5aaec7d-088d-41df-9b1e-e0eb09629b1e","Type":"ContainerStarted","Data":"9d29483cad3970f15c68f4dd3b18acfea48196eb6c844ab700a39ef065de364b"}
Apr 22 14:18:04.346425 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.346390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zt24h" event={"ID":"a5aaec7d-088d-41df-9b1e-e0eb09629b1e","Type":"ContainerStarted","Data":"39f52a021238dcef993cdcf571108bcf7d89bee795280cb7f5f3b4aeebd9a0f3"}
Apr 22 14:18:04.346425 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.346428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zt24h" event={"ID":"a5aaec7d-088d-41df-9b1e-e0eb09629b1e","Type":"ContainerStarted","Data":"af56024edcf52d46c8212723fd922e5adfa884cddcfa1adf125f1a93ae9b3570"}
Apr 22 14:18:04.346872 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.346517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zt24h"
Apr 22 14:18:04.347544 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.347525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jzpr7" event={"ID":"428e51f0-2daf-428e-8b5a-df5ee4eab661","Type":"ContainerStarted","Data":"d74f5832b5bb1be7d8c0a67b1158bf1096992ba5fe2ecf14ed73879f1b1e63c5"}
Apr 22 14:18:04.369863 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.369822 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zt24h" podStartSLOduration=129.220241036 podStartE2EDuration="2m11.369811635s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:18:01.676832466 +0000 UTC m=+160.524340427" lastFinishedPulling="2026-04-22 14:18:03.826403055 +0000 UTC m=+162.673911026" observedRunningTime="2026-04-22 14:18:04.36940485 +0000 UTC m=+163.216912835" watchObservedRunningTime="2026-04-22 14:18:04.369811635 +0000 UTC m=+163.217319618"
Apr 22 14:18:04.388500 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:04.388465 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jzpr7" podStartSLOduration=129.215719814 podStartE2EDuration="2m11.388455717s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:18:01.658093372 +0000 UTC m=+160.505601333" lastFinishedPulling="2026-04-22 14:18:03.830829271 +0000 UTC m=+162.678337236" observedRunningTime="2026-04-22 14:18:04.388051225 +0000 UTC m=+163.235559208" watchObservedRunningTime="2026-04-22 14:18:04.388455717 +0000 UTC m=+163.235963699"
Apr 22 14:18:10.987886 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:10.987854 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fqwqv"]
Apr 22 14:18:10.990970 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:10.990954 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:10.994656 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:10.994634 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qqgh\""
Apr 22 14:18:10.995766 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:10.995747 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 14:18:10.996290 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:10.996274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 14:18:11.021686 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.021661 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fqwqv"]
Apr 22 14:18:11.126740 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.126712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62386449-d154-47a4-b239-b0b6c68f2a85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.126906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.126752 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62386449-d154-47a4-b239-b0b6c68f2a85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.126906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.126771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62386449-d154-47a4-b239-b0b6c68f2a85-crio-socket\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.126906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.126847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276rz\" (UniqueName: \"kubernetes.io/projected/62386449-d154-47a4-b239-b0b6c68f2a85-kube-api-access-276rz\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.126906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.126881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/62386449-d154-47a4-b239-b0b6c68f2a85-data-volume\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227744 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62386449-d154-47a4-b239-b0b6c68f2a85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227744 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62386449-d154-47a4-b239-b0b6c68f2a85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227744 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62386449-d154-47a4-b239-b0b6c68f2a85-crio-socket\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227948 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-276rz\" (UniqueName: \"kubernetes.io/projected/62386449-d154-47a4-b239-b0b6c68f2a85-kube-api-access-276rz\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227948 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/62386449-d154-47a4-b239-b0b6c68f2a85-data-volume\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.227948 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.227884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62386449-d154-47a4-b239-b0b6c68f2a85-crio-socket\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.228190 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.228166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/62386449-d154-47a4-b239-b0b6c68f2a85-data-volume\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.228749 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.228732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62386449-d154-47a4-b239-b0b6c68f2a85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.229980 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.229965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62386449-d154-47a4-b239-b0b6c68f2a85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.239898 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.239876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-276rz\" (UniqueName: \"kubernetes.io/projected/62386449-d154-47a4-b239-b0b6c68f2a85-kube-api-access-276rz\") pod \"insights-runtime-extractor-fqwqv\" (UID: \"62386449-d154-47a4-b239-b0b6c68f2a85\") " pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.299605 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.299577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fqwqv"
Apr 22 14:18:11.421667 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.421643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fqwqv"]
Apr 22 14:18:11.425057 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:11.425019 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62386449_d154_47a4_b239_b0b6c68f2a85.slice/crio-54d3b640fb447d86389165de11efa847f1373656f4e5c8281d48880b6a91988b WatchSource:0}: Error finding container 54d3b640fb447d86389165de11efa847f1373656f4e5c8281d48880b6a91988b: Status 404 returned error can't find the container with id 54d3b640fb447d86389165de11efa847f1373656f4e5c8281d48880b6a91988b
Apr 22 14:18:11.746278 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:11.746194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m"
Apr 22 14:18:12.367604 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:12.367574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqwqv" event={"ID":"62386449-d154-47a4-b239-b0b6c68f2a85","Type":"ContainerStarted","Data":"ccbdcfe38d50721d9786af98baf09524fb381a2d52656488055f9773a3fd6224"}
Apr 22 14:18:12.367940 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:12.367613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqwqv" event={"ID":"62386449-d154-47a4-b239-b0b6c68f2a85","Type":"ContainerStarted","Data":"bb525677c3b0f2ed8d0431c4aca008e6763a84dc44ffdb5df0f391110f0b0b86"}
Apr 22 14:18:12.367940 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:12.367624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqwqv" event={"ID":"62386449-d154-47a4-b239-b0b6c68f2a85","Type":"ContainerStarted","Data":"54d3b640fb447d86389165de11efa847f1373656f4e5c8281d48880b6a91988b"}
Apr 22 14:18:13.042862 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:13.042817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:18:13.045419 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:13.045388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/acca7920-73c1-4c87-b10d-8087b0ef338e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-87cbq\" (UID: \"acca7920-73c1-4c87-b10d-8087b0ef338e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:18:13.341966 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:13.341946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"
Apr 22 14:18:13.462006 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:13.461975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq"]
Apr 22 14:18:13.465593 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:13.465567 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacca7920_73c1_4c87_b10d_8087b0ef338e.slice/crio-beb022cdfe0c6fda5888f53632cacd3f4fafb56503d98aa08645f432a38fec52 WatchSource:0}: Error finding container beb022cdfe0c6fda5888f53632cacd3f4fafb56503d98aa08645f432a38fec52: Status 404 returned error can't find the container with id beb022cdfe0c6fda5888f53632cacd3f4fafb56503d98aa08645f432a38fec52
Apr 22 14:18:14.351586 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:14.351557 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zt24h"
Apr 22 14:18:14.373950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:14.373914 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fqwqv" event={"ID":"62386449-d154-47a4-b239-b0b6c68f2a85","Type":"ContainerStarted","Data":"abc3ccdca89d2a8245c9de5be387b1e09d9d0fff8e31d6a437df6c7edd477749"}
Apr 22 14:18:14.374886 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:14.374864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" event={"ID":"acca7920-73c1-4c87-b10d-8087b0ef338e","Type":"ContainerStarted","Data":"beb022cdfe0c6fda5888f53632cacd3f4fafb56503d98aa08645f432a38fec52"}
Apr 22 14:18:14.401294 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:14.401236 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fqwqv" podStartSLOduration=2.538474362 podStartE2EDuration="4.401203593s" podCreationTimestamp="2026-04-22 14:18:10 +0000 UTC" firstStartedPulling="2026-04-22 14:18:11.47758559 +0000 UTC m=+170.325093551" lastFinishedPulling="2026-04-22 14:18:13.340314813 +0000 UTC m=+172.187822782" observedRunningTime="2026-04-22 14:18:14.398677496 +0000 UTC m=+173.246185479" watchObservedRunningTime="2026-04-22 14:18:14.401203593 +0000 UTC m=+173.248711573"
Apr 22 14:18:15.378647 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.378606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" event={"ID":"acca7920-73c1-4c87-b10d-8087b0ef338e","Type":"ContainerStarted","Data":"43f976e857a9e927e567860d608ceef00e9ddb82144279811353fa2ab9625fdf"}
Apr 22 14:18:15.406913 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.406867 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-87cbq" podStartSLOduration=32.812100693 podStartE2EDuration="34.406852282s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:18:13.467285146 +0000 UTC m=+172.314793107" lastFinishedPulling="2026-04-22 14:18:15.062036735 +0000 UTC m=+173.909544696" observedRunningTime="2026-04-22 14:18:15.406693658 +0000 UTC m=+174.254201640" watchObservedRunningTime="2026-04-22 14:18:15.406852282 +0000 UTC m=+174.254360265"
Apr 22 14:18:15.556328 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.556292 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"]
Apr 22 14:18:15.559560 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.559542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:15.562780 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.562757 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 14:18:15.562878 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.562760 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ljxwf\""
Apr 22 14:18:15.569134 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.569116 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"]
Apr 22 14:18:15.662469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.662393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e355f284-c206-4ce8-ac5f-d48f46066e84-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gzhhm\" (UID: \"e355f284-c206-4ce8-ac5f-d48f46066e84\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:15.763637 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.763608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e355f284-c206-4ce8-ac5f-d48f46066e84-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gzhhm\" (UID: \"e355f284-c206-4ce8-ac5f-d48f46066e84\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:15.765941 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.765910 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e355f284-c206-4ce8-ac5f-d48f46066e84-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gzhhm\" (UID: \"e355f284-c206-4ce8-ac5f-d48f46066e84\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:15.868070 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.868038 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:15.980141 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:15.980118 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"]
Apr 22 14:18:15.982486 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:15.982461 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode355f284_c206_4ce8_ac5f_d48f46066e84.slice/crio-117938ff5f8c0adbd6d4be8080edd985bde4b2519d0f83ca8ef4553407f164ab WatchSource:0}: Error finding container 117938ff5f8c0adbd6d4be8080edd985bde4b2519d0f83ca8ef4553407f164ab: Status 404 returned error can't find the container with id 117938ff5f8c0adbd6d4be8080edd985bde4b2519d0f83ca8ef4553407f164ab
Apr 22 14:18:16.382291 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:16.382259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm" event={"ID":"e355f284-c206-4ce8-ac5f-d48f46066e84","Type":"ContainerStarted","Data":"117938ff5f8c0adbd6d4be8080edd985bde4b2519d0f83ca8ef4553407f164ab"}
Apr 22 14:18:17.389428 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:17.389390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm" event={"ID":"e355f284-c206-4ce8-ac5f-d48f46066e84","Type":"ContainerStarted","Data":"2a7fe4023354a89d456b28a8aab52a77fa7a5d07b2efb4d1a96f71957d84436c"}
Apr 22 14:18:17.389873 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:17.389588 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:17.393923 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:17.393900 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm"
Apr 22 14:18:17.407027 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:17.406985 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gzhhm" podStartSLOduration=1.436331507 podStartE2EDuration="2.406973553s" podCreationTimestamp="2026-04-22 14:18:15 +0000 UTC" firstStartedPulling="2026-04-22 14:18:15.984362936 +0000 UTC m=+174.831870897" lastFinishedPulling="2026-04-22 14:18:16.955004978 +0000 UTC m=+175.802512943" observedRunningTime="2026-04-22 14:18:17.406281433 +0000 UTC m=+176.253789415" watchObservedRunningTime="2026-04-22 14:18:17.406973553 +0000 UTC m=+176.254481535"
Apr 22 14:18:21.990105 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:21.990066 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"]
Apr 22 14:18:22.001639 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.001612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.004405 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.004380 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 14:18:22.004708 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.004389 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 14:18:22.005100 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.005082 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-4f458\""
Apr 22 14:18:22.006101 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.006079 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"]
Apr 22 14:18:22.006366 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.006340 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 14:18:22.036872 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.036844 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qw96h"]
Apr 22 14:18:22.040616 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.040595 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.043721 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.043702 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 14:18:22.043928 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.043908 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 14:18:22.044055 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.043704 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wnzx7\""
Apr 22 14:18:22.044166 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.043716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 14:18:22.111680 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cd73319-ed5e-4a45-b956-321dea56a78e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.111680 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-sys\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-wtmp\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-root\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-textfile\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.111924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-metrics-client-ca\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.112233 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-accelerators-collector-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.112233 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.112233 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.111993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fzpl\" (UniqueName: \"kubernetes.io/projected/8cd73319-ed5e-4a45-b956-321dea56a78e-kube-api-access-8fzpl\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.112233 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.112016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.112233 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.112042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxglc\" (UniqueName: \"kubernetes.io/projected/474382f8-8953-42c3-ad82-82234aea8a10-kube-api-access-vxglc\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212754 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-accelerators-collector-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212754 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fzpl\" (UniqueName: \"kubernetes.io/projected/8cd73319-ed5e-4a45-b956-321dea56a78e-kube-api-access-8fzpl\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxglc\" (UniqueName: \"kubernetes.io/projected/474382f8-8953-42c3-ad82-82234aea8a10-kube-api-access-vxglc\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cd73319-ed5e-4a45-b956-321dea56a78e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-sys\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-wtmp\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:22.212941 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"
Apr 22 14:18:22.212996 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.212997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:22.213011 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls podName:474382f8-8953-42c3-ad82-82234aea8a10 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:22.712991552 +0000 UTC m=+181.560499528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls") pod "node-exporter-qw96h" (UID: "474382f8-8953-42c3-ad82-82234aea8a10") : secret "node-exporter-tls" not found
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-root\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-textfile\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-metrics-client-ca\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-sys\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h"
Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-root\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.213448 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-accelerators-collector-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.213781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-wtmp\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.213781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-textfile\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.213781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cd73319-ed5e-4a45-b956-321dea56a78e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" Apr 22 14:18:22.213925 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.213830 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/474382f8-8953-42c3-ad82-82234aea8a10-metrics-client-ca\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.215468 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.215442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.215781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.215756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" Apr 22 14:18:22.215922 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.215904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cd73319-ed5e-4a45-b956-321dea56a78e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" Apr 22 14:18:22.224559 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.224531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxglc\" (UniqueName: \"kubernetes.io/projected/474382f8-8953-42c3-ad82-82234aea8a10-kube-api-access-vxglc\") pod 
\"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.224953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.224933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fzpl\" (UniqueName: \"kubernetes.io/projected/8cd73319-ed5e-4a45-b956-321dea56a78e-kube-api-access-8fzpl\") pod \"openshift-state-metrics-9d44df66c-m7nlw\" (UID: \"8cd73319-ed5e-4a45-b956-321dea56a78e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" Apr 22 14:18:22.314073 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.314001 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" Apr 22 14:18:22.433189 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.433149 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw"] Apr 22 14:18:22.435118 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:22.435093 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd73319_ed5e_4a45_b956_321dea56a78e.slice/crio-64b9ae7709aed5eaadd03d8b0c478cc03013972de95b4dae06ea217e5d9808ed WatchSource:0}: Error finding container 64b9ae7709aed5eaadd03d8b0c478cc03013972de95b4dae06ea217e5d9808ed: Status 404 returned error can't find the container with id 64b9ae7709aed5eaadd03d8b0c478cc03013972de95b4dae06ea217e5d9808ed Apr 22 14:18:22.716577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.716542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.718834 ip-10-0-131-75 
kubenswrapper[2577]: I0422 14:18:22.718810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/474382f8-8953-42c3-ad82-82234aea8a10-node-exporter-tls\") pod \"node-exporter-qw96h\" (UID: \"474382f8-8953-42c3-ad82-82234aea8a10\") " pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:22.950571 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:22.950536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qw96h" Apr 22 14:18:23.054601 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.054524 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 14:18:23.060203 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.059611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.062557 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.062499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 14:18:23.063028 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063009 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 14:18:23.063244 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063229 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 14:18:23.063637 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 14:18:23.063745 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063722 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 14:18:23.063810 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 14:18:23.063984 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 14:18:23.064037 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.063996 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 14:18:23.064173 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.064160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 14:18:23.064356 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.064342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7hsj5\"" Apr 22 14:18:23.087362 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.087333 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 14:18:23.120307 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg5fw\" 
(UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120469 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120517 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.120764 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.120693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.221951 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.221911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222122 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.221980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222122 ip-10-0-131-75 kubenswrapper[2577]: 
I0422 14:18:23.222012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222122 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222122 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222122 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg5fw\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.222376 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.222337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.224160 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:23.223707 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle podName:265d28e2-13af-475a-be3a-1a2e193562ee nodeName:}" failed. No retries permitted until 2026-04-22 14:18:23.723685299 +0000 UTC m=+182.571193274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee") : configmap references non-existent config key: ca-bundle.crt Apr 22 14:18:23.224496 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.224472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.226266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.225928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
22 14:18:23.227063 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:23.227041 2577 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 14:18:23.227149 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:23.227122 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls podName:265d28e2-13af-475a-be3a-1a2e193562ee nodeName:}" failed. No retries permitted until 2026-04-22 14:18:23.727105612 +0000 UTC m=+182.574613580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee") : secret "alertmanager-main-tls" not found Apr 22 14:18:23.227349 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.227328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.228280 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.228237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.231037 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.230972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.231563 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.231538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.233450 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.233411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.234017 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.233995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.234410 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.234376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.235691 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.235672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.245407 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.245383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg5fw\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.408240 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.408207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qw96h" event={"ID":"474382f8-8953-42c3-ad82-82234aea8a10","Type":"ContainerStarted","Data":"6a2b99b325bddc849757e939028637700916cacc38a602db0ce3ef3df95fb0d8"} Apr 22 14:18:23.409847 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.409816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" event={"ID":"8cd73319-ed5e-4a45-b956-321dea56a78e","Type":"ContainerStarted","Data":"cb28026f2d6cddbf3f635cc829d73e4beea02fdbad49147eacea98bced02b3ba"} Apr 22 14:18:23.409847 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.409846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" event={"ID":"8cd73319-ed5e-4a45-b956-321dea56a78e","Type":"ContainerStarted","Data":"36eb3aef3b04174d52cc136d233849008dd828cdd8ca54eece377e1b061913af"} Apr 22 14:18:23.409980 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.409861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" 
event={"ID":"8cd73319-ed5e-4a45-b956-321dea56a78e","Type":"ContainerStarted","Data":"64b9ae7709aed5eaadd03d8b0c478cc03013972de95b4dae06ea217e5d9808ed"} Apr 22 14:18:23.727188 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.727100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.727330 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.727194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.728107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.728076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.729958 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.729937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:23.972381 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:23.972355 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:18:24.055869 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.055837 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d854f498b-wj7pn"] Apr 22 14:18:24.063194 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.063150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.068081 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068059 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 14:18:24.068259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1lrbs5f68e95j\"" Apr 22 14:18:24.068624 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 14:18:24.068714 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068680 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 14:18:24.068714 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068695 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-whbnh\"" Apr 22 14:18:24.068824 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068759 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 14:18:24.068976 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.068961 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 14:18:24.076632 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.076613 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d854f498b-wj7pn"] Apr 22 14:18:24.103895 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.103874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 14:18:24.107111 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:24.107090 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod265d28e2_13af_475a_be3a_1a2e193562ee.slice/crio-1c73991d5b95718a61b4c49cf1aa6da4443280a2ba7896a32e1484e2890aeef4 WatchSource:0}: Error finding container 1c73991d5b95718a61b4c49cf1aa6da4443280a2ba7896a32e1484e2890aeef4: Status 404 returned error can't find the container with id 1c73991d5b95718a61b4c49cf1aa6da4443280a2ba7896a32e1484e2890aeef4 Apr 22 14:18:24.130477 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/706bb132-5603-433e-95cd-80baf8a1ae5d-metrics-client-ca\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130573 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-grpc-tls\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130573 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130519 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxzk9\" (UniqueName: \"kubernetes.io/projected/706bb132-5603-433e-95cd-80baf8a1ae5d-kube-api-access-nxzk9\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130656 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-tls\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130700 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130736 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130777 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.130820 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.130798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231717 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-grpc-tls\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231717 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxzk9\" (UniqueName: \"kubernetes.io/projected/706bb132-5603-433e-95cd-80baf8a1ae5d-kube-api-access-nxzk9\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231912 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-tls\") pod 
\"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231912 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231912 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231912 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.231912 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " 
pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.232152 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.231918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/706bb132-5603-433e-95cd-80baf8a1ae5d-metrics-client-ca\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.232596 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.232540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/706bb132-5603-433e-95cd-80baf8a1ae5d-metrics-client-ca\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.234661 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.234639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.234787 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.234766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-grpc-tls\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.234921 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.234897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" 
(UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.235003 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.234899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.235144 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.235125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-tls\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.235204 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.235139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/706bb132-5603-433e-95cd-80baf8a1ae5d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: \"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.240035 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.240017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxzk9\" (UniqueName: \"kubernetes.io/projected/706bb132-5603-433e-95cd-80baf8a1ae5d-kube-api-access-nxzk9\") pod \"thanos-querier-5d854f498b-wj7pn\" (UID: 
\"706bb132-5603-433e-95cd-80baf8a1ae5d\") " pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.372547 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.372510 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:24.415086 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.415051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" event={"ID":"8cd73319-ed5e-4a45-b956-321dea56a78e","Type":"ContainerStarted","Data":"786828b1d3a9aa3f9f28e0c04fe89799a7b11a400879ee04c9fd2df249e85ace"} Apr 22 14:18:24.416624 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.416542 2577 generic.go:358] "Generic (PLEG): container finished" podID="474382f8-8953-42c3-ad82-82234aea8a10" containerID="467a944e53120a0329c548abb6ffbf2980e2a573faf632a5aaedb439ba80d542" exitCode=0 Apr 22 14:18:24.416624 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.416598 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qw96h" event={"ID":"474382f8-8953-42c3-ad82-82234aea8a10","Type":"ContainerDied","Data":"467a944e53120a0329c548abb6ffbf2980e2a573faf632a5aaedb439ba80d542"} Apr 22 14:18:24.418366 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.418321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"1c73991d5b95718a61b4c49cf1aa6da4443280a2ba7896a32e1484e2890aeef4"} Apr 22 14:18:24.469493 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.469441 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7nlw" podStartSLOduration=2.572130386 podStartE2EDuration="3.469420346s" podCreationTimestamp="2026-04-22 14:18:21 +0000 UTC" firstStartedPulling="2026-04-22 
14:18:22.552712149 +0000 UTC m=+181.400220114" lastFinishedPulling="2026-04-22 14:18:23.450002099 +0000 UTC m=+182.297510074" observedRunningTime="2026-04-22 14:18:24.440390516 +0000 UTC m=+183.287898515" watchObservedRunningTime="2026-04-22 14:18:24.469420346 +0000 UTC m=+183.316928330" Apr 22 14:18:24.532161 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.532135 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d854f498b-wj7pn"] Apr 22 14:18:24.534599 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:24.534578 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706bb132_5603_433e_95cd_80baf8a1ae5d.slice/crio-1f9a08d369f976b5d33b49af5cd8b44abe45236cdb617b6aa71895439e42b04d WatchSource:0}: Error finding container 1f9a08d369f976b5d33b49af5cd8b44abe45236cdb617b6aa71895439e42b04d: Status 404 returned error can't find the container with id 1f9a08d369f976b5d33b49af5cd8b44abe45236cdb617b6aa71895439e42b04d Apr 22 14:18:24.564767 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.564743 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-s2tpm"] Apr 22 14:18:24.569673 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.569653 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:24.573083 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.573062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-7zg5w\"" Apr 22 14:18:24.573934 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.573911 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 14:18:24.574019 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.573941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 14:18:24.581246 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.581223 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s2tpm"] Apr 22 14:18:24.637614 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.637575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77knx\" (UniqueName: \"kubernetes.io/projected/f0b2d52f-ab32-4412-9351-228c6d681e29-kube-api-access-77knx\") pod \"downloads-6bcc868b7-s2tpm\" (UID: \"f0b2d52f-ab32-4412-9351-228c6d681e29\") " pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:24.738779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.738733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77knx\" (UniqueName: \"kubernetes.io/projected/f0b2d52f-ab32-4412-9351-228c6d681e29-kube-api-access-77knx\") pod \"downloads-6bcc868b7-s2tpm\" (UID: \"f0b2d52f-ab32-4412-9351-228c6d681e29\") " pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:24.748809 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.748743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77knx\" (UniqueName: 
\"kubernetes.io/projected/f0b2d52f-ab32-4412-9351-228c6d681e29-kube-api-access-77knx\") pod \"downloads-6bcc868b7-s2tpm\" (UID: \"f0b2d52f-ab32-4412-9351-228c6d681e29\") " pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:24.879803 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:24.879757 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:25.146321 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.146247 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s2tpm"] Apr 22 14:18:25.151107 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:25.151048 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0b2d52f_ab32_4412_9351_228c6d681e29.slice/crio-4a6dc5007e89a94b96c699fb567e2a8aec601b793042766401900f262c74703b WatchSource:0}: Error finding container 4a6dc5007e89a94b96c699fb567e2a8aec601b793042766401900f262c74703b: Status 404 returned error can't find the container with id 4a6dc5007e89a94b96c699fb567e2a8aec601b793042766401900f262c74703b Apr 22 14:18:25.422688 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.422650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s2tpm" event={"ID":"f0b2d52f-ab32-4412-9351-228c6d681e29","Type":"ContainerStarted","Data":"4a6dc5007e89a94b96c699fb567e2a8aec601b793042766401900f262c74703b"} Apr 22 14:18:25.424779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.424751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qw96h" event={"ID":"474382f8-8953-42c3-ad82-82234aea8a10","Type":"ContainerStarted","Data":"6c07e2bf4dd2944b89800459c0d3cb61b72daea220428454c7e4dc2cb5cf217a"} Apr 22 14:18:25.424889 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.424785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-qw96h" event={"ID":"474382f8-8953-42c3-ad82-82234aea8a10","Type":"ContainerStarted","Data":"82036b4e39e80ec37a997fd1a7307826ac43f9934ab7ffde6a99afc27c8685f3"} Apr 22 14:18:25.426066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.426042 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333" exitCode=0 Apr 22 14:18:25.426221 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.426128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333"} Apr 22 14:18:25.427368 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.427345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"1f9a08d369f976b5d33b49af5cd8b44abe45236cdb617b6aa71895439e42b04d"} Apr 22 14:18:25.447468 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:25.447426 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qw96h" podStartSLOduration=2.597840092 podStartE2EDuration="3.447411757s" podCreationTimestamp="2026-04-22 14:18:22 +0000 UTC" firstStartedPulling="2026-04-22 14:18:22.964007752 +0000 UTC m=+181.811515921" lastFinishedPulling="2026-04-22 14:18:23.813579625 +0000 UTC m=+182.661087586" observedRunningTime="2026-04-22 14:18:25.445938898 +0000 UTC m=+184.293446947" watchObservedRunningTime="2026-04-22 14:18:25.447411757 +0000 UTC m=+184.294919739" Apr 22 14:18:26.788005 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.787967 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5"] Apr 22 
14:18:26.791655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.791629 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:26.795813 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.795602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 14:18:26.796584 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.796402 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-h2v97\"" Apr 22 14:18:26.817751 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.817716 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5"] Apr 22 14:18:26.963947 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:26.963912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d61d92ca-13d0-4e38-bff7-be18fc721d92-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4nfd5\" (UID: \"d61d92ca-13d0-4e38-bff7-be18fc721d92\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:27.064817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.064748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d61d92ca-13d0-4e38-bff7-be18fc721d92-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4nfd5\" (UID: \"d61d92ca-13d0-4e38-bff7-be18fc721d92\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:27.067292 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.067261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/d61d92ca-13d0-4e38-bff7-be18fc721d92-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4nfd5\" (UID: \"d61d92ca-13d0-4e38-bff7-be18fc721d92\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:27.103444 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.103410 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:27.239171 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.239146 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:18:27.243029 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.242428 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.248054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.247228 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:18:27.248054 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.247337 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:18:27.248730 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.248707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nhxxm\"" Apr 22 14:18:27.248825 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.248744 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:18:27.249021 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.249007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:18:27.251830 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.250342 2577 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:18:27.256659 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.256635 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:18:27.277542 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.277497 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5"] Apr 22 14:18:27.282040 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:27.281826 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61d92ca_13d0_4e38_bff7_be18fc721d92.slice/crio-4f24bf6de5a9acf348fb2d3cb3695be8832ea8425612566de04dcb9805f12b89 WatchSource:0}: Error finding container 4f24bf6de5a9acf348fb2d3cb3695be8832ea8425612566de04dcb9805f12b89: Status 404 returned error can't find the container with id 4f24bf6de5a9acf348fb2d3cb3695be8832ea8425612566de04dcb9805f12b89 Apr 22 14:18:27.368346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.368346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.368346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368306 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.368346 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368340 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.368528 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tfc\" (UniqueName: \"kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.368528 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.368495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.436913 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.436854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" event={"ID":"d61d92ca-13d0-4e38-bff7-be18fc721d92","Type":"ContainerStarted","Data":"4f24bf6de5a9acf348fb2d3cb3695be8832ea8425612566de04dcb9805f12b89"} Apr 22 
14:18:27.439964 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.439900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6"} Apr 22 14:18:27.439964 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.439931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7"} Apr 22 14:18:27.439964 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.439946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831"} Apr 22 14:18:27.441789 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.441727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"d811151a46675751186f0bd1cc7a6b032a87023e16f06cbc433bfa059d984651"} Apr 22 14:18:27.441789 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.441755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"4a50ae2532e99c29b714afa3e45d6959202217ebb40e877b174da4415be1f471"} Apr 22 14:18:27.441789 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.441769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" 
event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"829fc297a9e646c61de46ab5aac836bdd591c9608239a618397e8e60d17468dc"} Apr 22 14:18:27.469578 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.469671 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77tfc\" (UniqueName: \"kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.469671 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.469761 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.469822 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.469822 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.469801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.471068 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.471022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.471167 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.471152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.471274 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.471216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.472219 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.472172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.472811 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.472776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.491207 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.491153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tfc\" (UniqueName: \"kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc\") pod \"console-756bfdf5f-2tm8t\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.560149 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.560118 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:27.693777 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:27.693687 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:18:27.696462 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:18:27.696431 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c4ad2aa_ecf4_4dfa_bfcc_17a5e55fac56.slice/crio-251c8138bb1fe10887ee01289a996a759343cb4e2a4c34488b0de66ab0e55505 WatchSource:0}: Error finding container 251c8138bb1fe10887ee01289a996a759343cb4e2a4c34488b0de66ab0e55505: Status 404 returned error can't find the container with id 251c8138bb1fe10887ee01289a996a759343cb4e2a4c34488b0de66ab0e55505 Apr 22 14:18:28.337523 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.337356 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:28.344082 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.344061 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.352753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.352974 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.353024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.352974 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.353249 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.353483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.353581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9xx5w\"" Apr 22 14:18:28.353945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.353703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 14:18:28.355107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.354571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 14:18:28.355107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.354799 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4r206nra9h7l5\"" Apr 22 14:18:28.355107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.354977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 14:18:28.356038 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.355570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 14:18:28.356038 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.355809 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 14:18:28.356038 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.356033 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 14:18:28.356312 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.356132 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 14:18:28.366143 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.366113 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:28.449118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.449044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3"} Apr 22 14:18:28.449118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.449081 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de"} Apr 22 14:18:28.449118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.449097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerStarted","Data":"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008"} Apr 22 14:18:28.451901 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.451868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"113d15df87e9a488c1876b995153765ddbda061023fec2d65b56b9a85eb0fc95"} Apr 22 14:18:28.454308 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.453938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756bfdf5f-2tm8t" event={"ID":"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56","Type":"ContainerStarted","Data":"251c8138bb1fe10887ee01289a996a759343cb4e2a4c34488b0de66ab0e55505"} Apr 22 14:18:28.480356 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzfx\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480471 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480471 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480541 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480573 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480605 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480634 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480801 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480801 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.480801 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 
kubenswrapper[2577]: I0422 14:18:28.480859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.480981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481012 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.481007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481316 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.481038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.481316 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.481111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.490871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.490300 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.3344960590000001 podStartE2EDuration="5.490285313s" podCreationTimestamp="2026-04-22 14:18:23 +0000 UTC" firstStartedPulling="2026-04-22 14:18:24.108847556 +0000 UTC m=+182.956355517" lastFinishedPulling="2026-04-22 14:18:28.264636806 +0000 UTC m=+187.112144771" observedRunningTime="2026-04-22 14:18:28.48392833 +0000 UTC m=+187.331436339" watchObservedRunningTime="2026-04-22 14:18:28.490285313 +0000 UTC m=+187.337793296" Apr 22 14:18:28.582104 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582104 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582104 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582269 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 
14:18:28.582474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582682 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzfx\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx\") pod \"prometheus-k8s-0\" (UID: 
\"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582926 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582926 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.582926 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.582778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.585654 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.584352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.585654 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.584443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.585654 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.585126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.585929 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.585847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.588971 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.588346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.588971 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.588433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.588971 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.588577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.589169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.589031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.589870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.589904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.590018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.590055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.590340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.590575 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.590536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.592939 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.592877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.593554 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.593513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.594783 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.594760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.595039 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.595000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzfx\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx\") pod \"prometheus-k8s-0\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.665380 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.665271 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:28.956763 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:28.956574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:29.120789 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:18:29.120751 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5734f31b_1afd_4fc0_84e8_533af7ca0af6.slice/crio-conmon-5252046e1df30316c85bd9fac1bbab7de20041fd9c1946445930d912dd17533e.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:18:29.458942 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.458788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" event={"ID":"d61d92ca-13d0-4e38-bff7-be18fc721d92","Type":"ContainerStarted","Data":"9a6a3db6b0974ffbccc346911a6a31f6b8d88ad2a089582871269a98c98283bb"} Apr 22 14:18:29.459410 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.459372 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 
14:18:29.463474 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.463434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"0f049f3387a932d795abba4d9c60668766b3037681d47743b96fe841af10cc75"} Apr 22 14:18:29.463623 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.463487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" event={"ID":"706bb132-5603-433e-95cd-80baf8a1ae5d","Type":"ContainerStarted","Data":"dbc758b9e9e7a7c9123ce94d8c47e6977996c804342cbef3a2b1d8a2964864ab"} Apr 22 14:18:29.463623 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.463584 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:29.465612 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.465554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" Apr 22 14:18:29.465706 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.465636 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="5252046e1df30316c85bd9fac1bbab7de20041fd9c1946445930d912dd17533e" exitCode=0 Apr 22 14:18:29.465816 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.465797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"5252046e1df30316c85bd9fac1bbab7de20041fd9c1946445930d912dd17533e"} Apr 22 14:18:29.465881 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.465824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"0358e9aaeea188e4b33414ae8b851c7aab361a49260e551d86f128be5df28a1e"} Apr 22 14:18:29.474909 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.474664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4nfd5" podStartSLOduration=1.899456853 podStartE2EDuration="3.474653607s" podCreationTimestamp="2026-04-22 14:18:26 +0000 UTC" firstStartedPulling="2026-04-22 14:18:27.283390616 +0000 UTC m=+186.130898580" lastFinishedPulling="2026-04-22 14:18:28.858587368 +0000 UTC m=+187.706095334" observedRunningTime="2026-04-22 14:18:29.474328013 +0000 UTC m=+188.321835997" watchObservedRunningTime="2026-04-22 14:18:29.474653607 +0000 UTC m=+188.322161591" Apr 22 14:18:29.569777 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:29.569698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" podStartSLOduration=1.843964698 podStartE2EDuration="5.569678455s" podCreationTimestamp="2026-04-22 14:18:24 +0000 UTC" firstStartedPulling="2026-04-22 14:18:24.536660192 +0000 UTC m=+183.384168153" lastFinishedPulling="2026-04-22 14:18:28.262373937 +0000 UTC m=+187.109881910" observedRunningTime="2026-04-22 14:18:29.566846451 +0000 UTC m=+188.414354436" watchObservedRunningTime="2026-04-22 14:18:29.569678455 +0000 UTC m=+188.417186439" Apr 22 14:18:31.475331 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:31.475286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756bfdf5f-2tm8t" event={"ID":"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56","Type":"ContainerStarted","Data":"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101"} Apr 22 14:18:31.497823 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:31.497767 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-756bfdf5f-2tm8t" podStartSLOduration=1.403170038 podStartE2EDuration="4.497751994s" podCreationTimestamp="2026-04-22 14:18:27 +0000 UTC" firstStartedPulling="2026-04-22 14:18:27.698444035 +0000 UTC m=+186.545951996" lastFinishedPulling="2026-04-22 14:18:30.793025985 +0000 UTC m=+189.640533952" observedRunningTime="2026-04-22 14:18:31.496208737 +0000 UTC m=+190.343716720" watchObservedRunningTime="2026-04-22 14:18:31.497751994 +0000 UTC m=+190.345260000" Apr 22 14:18:33.486675 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:33.486646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"b9728c0e0ac5a16227f73a6bd17c35a5292d4aee7bf94f66106d8a6d91cd1f93"} Apr 22 14:18:34.493857 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.493820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"daa91764595fe06fbaa85e31e1ae80dbeb2ced4356aefb44c7212fb7901ac2d5"} Apr 22 14:18:34.493857 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.493861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"56a29689cce45540c217d4e1499d91b5aa8ce81a597f5d535619a8449de45628"} Apr 22 14:18:34.494298 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.493876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"be07e541bb3367dad849690ffda3510ae17af3c6665a2ea6e178c99e353c59f2"} Apr 22 14:18:34.494298 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.493888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"610a2695d4f4f5c55dcccdb59b2d4d45fdb1e16f5e7f89d4df0dbe519eb21b19"} Apr 22 14:18:34.494298 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.493900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerStarted","Data":"49d11125ac3006b23c1ca6b8bc11fd9957c738f3e94cf95478b94c036e867d92"} Apr 22 14:18:34.536745 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:34.535774 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.669155763 podStartE2EDuration="6.535756444s" podCreationTimestamp="2026-04-22 14:18:28 +0000 UTC" firstStartedPulling="2026-04-22 14:18:29.467481963 +0000 UTC m=+188.314989929" lastFinishedPulling="2026-04-22 14:18:33.334082635 +0000 UTC m=+192.181590610" observedRunningTime="2026-04-22 14:18:34.534169234 +0000 UTC m=+193.381677217" watchObservedRunningTime="2026-04-22 14:18:34.535756444 +0000 UTC m=+193.383264428" Apr 22 14:18:35.476833 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:35.476807 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d854f498b-wj7pn" Apr 22 14:18:37.561045 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:37.561011 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:37.561540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:37.561058 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:37.566403 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:37.566369 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:38.510839 ip-10-0-131-75 
kubenswrapper[2577]: I0422 14:18:38.510812 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:18:38.668138 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:38.668061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:42.520631 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:42.520585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s2tpm" event={"ID":"f0b2d52f-ab32-4412-9351-228c6d681e29","Type":"ContainerStarted","Data":"a4e4eb82b7ef13d3f736ef23cda802762f2e427ca8f5cf8545676003b42b7518"} Apr 22 14:18:42.521025 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:42.520720 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:42.535856 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:42.535827 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-s2tpm" Apr 22 14:18:42.539994 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:42.539953 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-s2tpm" podStartSLOduration=1.7799988629999999 podStartE2EDuration="18.539942491s" podCreationTimestamp="2026-04-22 14:18:24 +0000 UTC" firstStartedPulling="2026-04-22 14:18:25.153381137 +0000 UTC m=+184.000889113" lastFinishedPulling="2026-04-22 14:18:41.913324765 +0000 UTC m=+200.760832741" observedRunningTime="2026-04-22 14:18:42.538860278 +0000 UTC m=+201.386368263" watchObservedRunningTime="2026-04-22 14:18:42.539942491 +0000 UTC m=+201.387450475" Apr 22 14:18:49.255414 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:49.255379 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:18:59.101920 ip-10-0-131-75 
kubenswrapper[2577]: E0422 14:18:59.101889 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f7e5c1_d4ff_4d3c_ba1e_425c1585a851.slice/crio-conmon-16249b3c030daa12aecc844a023c861188e751ba54b00da8a5275db15aaa3f60.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:18:59.570573 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:59.570539 2577 generic.go:358] "Generic (PLEG): container finished" podID="63f7e5c1-d4ff-4d3c-ba1e-425c1585a851" containerID="16249b3c030daa12aecc844a023c861188e751ba54b00da8a5275db15aaa3f60" exitCode=0 Apr 22 14:18:59.570724 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:59.570579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cl44f" event={"ID":"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851","Type":"ContainerDied","Data":"16249b3c030daa12aecc844a023c861188e751ba54b00da8a5275db15aaa3f60"} Apr 22 14:18:59.570906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:18:59.570890 2577 scope.go:117] "RemoveContainer" containerID="16249b3c030daa12aecc844a023c861188e751ba54b00da8a5275db15aaa3f60" Apr 22 14:19:00.575014 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:00.574980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cl44f" event={"ID":"63f7e5c1-d4ff-4d3c-ba1e-425c1585a851","Type":"ContainerStarted","Data":"d350a5265d07bb0f8940a134e89dce9572c00d643505847c4b1528e1734743b6"} Apr 22 14:19:09.599808 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:09.599772 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9b18091-6470-4c4d-9813-07ff092aaa8b" containerID="ea863b9c62fac62ce19b2e1bdf41d3808e0e5e67159c5f4e28221a2836c254c8" exitCode=0 Apr 22 14:19:09.600318 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:09.599818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" event={"ID":"c9b18091-6470-4c4d-9813-07ff092aaa8b","Type":"ContainerDied","Data":"ea863b9c62fac62ce19b2e1bdf41d3808e0e5e67159c5f4e28221a2836c254c8"} Apr 22 14:19:09.600318 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:09.600141 2577 scope.go:117] "RemoveContainer" containerID="ea863b9c62fac62ce19b2e1bdf41d3808e0e5e67159c5f4e28221a2836c254c8" Apr 22 14:19:10.603940 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:10.603896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sw58b" event={"ID":"c9b18091-6470-4c4d-9813-07ff092aaa8b","Type":"ContainerStarted","Data":"3e82540de264651697122db4b661e46c4139b107114dd4d9459dfcb26795b83d"} Apr 22 14:19:14.277948 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.277882 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-756bfdf5f-2tm8t" podUID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" containerName="console" containerID="cri-o://2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101" gracePeriod=15 Apr 22 14:19:14.566558 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.566533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756bfdf5f-2tm8t_4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56/console/0.log" Apr 22 14:19:14.566683 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.566621 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:19:14.620299 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756bfdf5f-2tm8t_4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56/console/0.log" Apr 22 14:19:14.620464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620315 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" containerID="2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101" exitCode=2 Apr 22 14:19:14.620464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756bfdf5f-2tm8t" event={"ID":"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56","Type":"ContainerDied","Data":"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101"} Apr 22 14:19:14.620464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756bfdf5f-2tm8t" event={"ID":"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56","Type":"ContainerDied","Data":"251c8138bb1fe10887ee01289a996a759343cb4e2a4c34488b0de66ab0e55505"} Apr 22 14:19:14.620464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620379 2577 scope.go:117] "RemoveContainer" containerID="2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101" Apr 22 14:19:14.620464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.620396 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756bfdf5f-2tm8t" Apr 22 14:19:14.628953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.628920 2577 scope.go:117] "RemoveContainer" containerID="2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101" Apr 22 14:19:14.629265 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:14.629230 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101\": container with ID starting with 2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101 not found: ID does not exist" containerID="2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101" Apr 22 14:19:14.629368 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.629276 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101"} err="failed to get container status \"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101\": rpc error: code = NotFound desc = could not find container \"2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101\": container with ID starting with 2f9f38b7bdec200c84a9e892ea001d04074070c58d6d6e9657b0e152dca0d101 not found: ID does not exist" Apr 22 14:19:14.697755 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697713 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.697950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697773 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.697950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697829 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77tfc\" (UniqueName: \"kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.697950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697860 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.697950 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697917 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.698190 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.697986 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config\") pod \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\" (UID: \"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56\") " Apr 22 14:19:14.703270 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.700479 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:14.703270 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.700745 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config" (OuterVolumeSpecName: "console-config") pod "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:14.703270 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.701062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca" (OuterVolumeSpecName: "service-ca") pod "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:14.703697 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.703669 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:14.703697 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.703689 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:14.704049 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.704020 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc" (OuterVolumeSpecName: "kube-api-access-77tfc") pod "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" (UID: "4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56"). InnerVolumeSpecName "kube-api-access-77tfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799669 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-77tfc\" (UniqueName: \"kubernetes.io/projected/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-kube-api-access-77tfc\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799709 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799726 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-service-ca\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799741 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-oauth-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799759 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-console-serving-cert\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.799853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.799774 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56-oauth-serving-cert\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:14.941273 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.941245 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:19:14.945943 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:14.945920 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-756bfdf5f-2tm8t"] Apr 22 14:19:15.747995 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:15.747964 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" path="/var/lib/kubelet/pods/4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56/volumes" Apr 22 14:19:28.665979 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:28.665948 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:28.680907 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:28.680878 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:29.677538 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:29.677510 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:32.457330 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:32.457296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod 
\"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:19:32.459420 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:32.459399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e382d5b-073e-4cd5-adc4-f9741cc073d8-metrics-certs\") pod \"network-metrics-daemon-sfm8m\" (UID: \"5e382d5b-073e-4cd5-adc4-f9741cc073d8\") " pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:19:32.749293 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:32.749218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\"" Apr 22 14:19:32.757294 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:32.757276 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sfm8m" Apr 22 14:19:32.887633 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:32.887610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sfm8m"] Apr 22 14:19:32.888999 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:19:32.888977 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e382d5b_073e_4cd5_adc4_f9741cc073d8.slice/crio-725a635b3f571531fee6fb77b187133e0514e0c839edd01cef25ca30a68f47d5 WatchSource:0}: Error finding container 725a635b3f571531fee6fb77b187133e0514e0c839edd01cef25ca30a68f47d5: Status 404 returned error can't find the container with id 725a635b3f571531fee6fb77b187133e0514e0c839edd01cef25ca30a68f47d5 Apr 22 14:19:33.676531 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:33.676490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sfm8m" 
event={"ID":"5e382d5b-073e-4cd5-adc4-f9741cc073d8","Type":"ContainerStarted","Data":"725a635b3f571531fee6fb77b187133e0514e0c839edd01cef25ca30a68f47d5"} Apr 22 14:19:34.680549 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:34.680516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sfm8m" event={"ID":"5e382d5b-073e-4cd5-adc4-f9741cc073d8","Type":"ContainerStarted","Data":"aec5652855b8c460e0badcfa714596b3195a2602c70bc6b44a3f82207cca9688"} Apr 22 14:19:34.680549 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:34.680552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sfm8m" event={"ID":"5e382d5b-073e-4cd5-adc4-f9741cc073d8","Type":"ContainerStarted","Data":"d1594398220d9799df3d2d505a9c9fc1428d605cb4d8964db2b28fa43b63fb43"} Apr 22 14:19:34.699343 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:34.699291 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sfm8m" podStartSLOduration=252.788234013 podStartE2EDuration="4m13.699273796s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:19:32.890958881 +0000 UTC m=+251.738466844" lastFinishedPulling="2026-04-22 14:19:33.801998666 +0000 UTC m=+252.649506627" observedRunningTime="2026-04-22 14:19:34.698374675 +0000 UTC m=+253.545882659" watchObservedRunningTime="2026-04-22 14:19:34.699273796 +0000 UTC m=+253.546781783" Apr 22 14:19:42.410728 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.410645 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 14:19:42.411142 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411103 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="alertmanager" 
containerID="cri-o://ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831" gracePeriod=120 Apr 22 14:19:42.411255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411100 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="prom-label-proxy" containerID="cri-o://e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" gracePeriod=120 Apr 22 14:19:42.411255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411137 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-metric" containerID="cri-o://1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" gracePeriod=120 Apr 22 14:19:42.411255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411161 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="config-reloader" containerID="cri-o://d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7" gracePeriod=120 Apr 22 14:19:42.411255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411137 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-web" containerID="cri-o://0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6" gracePeriod=120 Apr 22 14:19:42.411255 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.411118 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy" 
containerID="cri-o://7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008" gracePeriod=120 Apr 22 14:19:42.709142 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709070 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" exitCode=0 Apr 22 14:19:42.709142 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709095 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008" exitCode=0 Apr 22 14:19:42.709142 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709101 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7" exitCode=0 Apr 22 14:19:42.709142 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709106 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831" exitCode=0 Apr 22 14:19:42.709383 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3"} Apr 22 14:19:42.709383 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008"} Apr 22 14:19:42.709383 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7"} Apr 22 14:19:42.709383 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:42.709203 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831"} Apr 22 14:19:43.648563 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.648544 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:19:43.715112 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715033 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" exitCode=0 Apr 22 14:19:43.715112 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715059 2577 generic.go:358] "Generic (PLEG): container finished" podID="265d28e2-13af-475a-be3a-1a2e193562ee" containerID="0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6" exitCode=0 Apr 22 14:19:43.715320 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de"} Apr 22 14:19:43.715320 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715148 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 14:19:43.715320 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6"} Apr 22 14:19:43.715320 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"265d28e2-13af-475a-be3a-1a2e193562ee","Type":"ContainerDied","Data":"1c73991d5b95718a61b4c49cf1aa6da4443280a2ba7896a32e1484e2890aeef4"} Apr 22 14:19:43.715320 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.715207 2577 scope.go:117] "RemoveContainer" containerID="e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" Apr 22 14:19:43.722091 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.722073 2577 scope.go:117] "RemoveContainer" containerID="1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" Apr 22 14:19:43.728636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.728614 2577 scope.go:117] "RemoveContainer" containerID="7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008" Apr 22 14:19:43.734661 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.734645 2577 scope.go:117] "RemoveContainer" containerID="0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6" Apr 22 14:19:43.740623 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.740606 2577 scope.go:117] "RemoveContainer" containerID="d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7" Apr 22 14:19:43.746700 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.746677 2577 scope.go:117] "RemoveContainer" containerID="ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831" Apr 22 14:19:43.752990 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.752974 
2577 scope.go:117] "RemoveContainer" containerID="a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333" Apr 22 14:19:43.757042 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757024 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg5fw\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757127 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757053 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757127 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757075 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757127 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757113 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757297 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757135 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: 
\"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757297 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757164 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757297 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757214 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757297 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757239 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757297 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757290 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757335 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " 
Apr 22 14:19:43.757540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757365 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757418 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.757540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757465 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets\") pod \"265d28e2-13af-475a-be3a-1a2e193562ee\" (UID: \"265d28e2-13af-475a-be3a-1a2e193562ee\") " Apr 22 14:19:43.758138 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757837 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:43.758138 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.757856 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:19:43.758309 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.758206 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:43.759781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.759745 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw" (OuterVolumeSpecName: "kube-api-access-jg5fw") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "kube-api-access-jg5fw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:43.760088 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.760052 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:43.760339 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.760301 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:43.760494 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.760464 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out" (OuterVolumeSpecName: "config-out") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:19:43.760810 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.760770 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:43.761461 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.761032 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:43.761853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.761817 2577 scope.go:117] "RemoveContainer" containerID="e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" Apr 22 14:19:43.761853 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.761834 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:43.762027 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762004 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:43.762116 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.762073 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3\": container with ID starting with e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3 not found: ID does not exist" containerID="e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" Apr 22 14:19:43.762244 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762102 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3"} err="failed to get container status \"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3\": rpc error: code = NotFound desc = could not find container \"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3\": container with ID starting with e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3 not found: ID does not exist" Apr 22 14:19:43.762244 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762125 2577 scope.go:117] "RemoveContainer" containerID="1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" Apr 22 14:19:43.762482 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.762461 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de\": container with ID starting with 1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de not found: ID does not exist" containerID="1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" Apr 22 14:19:43.762551 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762492 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de"} err="failed to get container status \"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de\": rpc error: code = NotFound desc = could not find container \"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de\": container with ID starting with 1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de not found: ID does not exist" Apr 22 14:19:43.762551 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762514 2577 scope.go:117] "RemoveContainer" containerID="7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008" Apr 22 14:19:43.762887 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.762853 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008\": container with ID starting with 7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008 not found: ID does not exist" containerID="7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008" Apr 22 14:19:43.762961 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762886 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008"} err="failed to get container status \"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008\": rpc error: code = NotFound desc = could not find container \"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008\": container with ID starting with 7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008 not found: ID does not exist" Apr 22 14:19:43.762961 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.762907 2577 scope.go:117] "RemoveContainer" containerID="0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6" Apr 22 14:19:43.763279 ip-10-0-131-75 
kubenswrapper[2577]: E0422 14:19:43.763259 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6\": container with ID starting with 0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6 not found: ID does not exist" containerID="0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6" Apr 22 14:19:43.763369 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763286 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6"} err="failed to get container status \"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6\": rpc error: code = NotFound desc = could not find container \"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6\": container with ID starting with 0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6 not found: ID does not exist" Apr 22 14:19:43.763369 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763304 2577 scope.go:117] "RemoveContainer" containerID="d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7" Apr 22 14:19:43.763562 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.763546 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7\": container with ID starting with d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7 not found: ID does not exist" containerID="d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7" Apr 22 14:19:43.763644 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763567 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7"} err="failed to 
get container status \"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7\": rpc error: code = NotFound desc = could not find container \"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7\": container with ID starting with d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7 not found: ID does not exist" Apr 22 14:19:43.763644 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763585 2577 scope.go:117] "RemoveContainer" containerID="ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831" Apr 22 14:19:43.763838 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.763822 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831\": container with ID starting with ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831 not found: ID does not exist" containerID="ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831" Apr 22 14:19:43.763884 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763842 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831"} err="failed to get container status \"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831\": rpc error: code = NotFound desc = could not find container \"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831\": container with ID starting with ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831 not found: ID does not exist" Apr 22 14:19:43.763884 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.763856 2577 scope.go:117] "RemoveContainer" containerID="a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333" Apr 22 14:19:43.764096 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:19:43.764068 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333\": container with ID starting with a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333 not found: ID does not exist" containerID="a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333" Apr 22 14:19:43.764146 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764104 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333"} err="failed to get container status \"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333\": rpc error: code = NotFound desc = could not find container \"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333\": container with ID starting with a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333 not found: ID does not exist" Apr 22 14:19:43.764146 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764119 2577 scope.go:117] "RemoveContainer" containerID="e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3" Apr 22 14:19:43.764413 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764391 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3"} err="failed to get container status \"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3\": rpc error: code = NotFound desc = could not find container \"e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3\": container with ID starting with e7b7f9d3691c28c56eaf991c0e913ce9fabaca793aeb62fe408f3f4c934000d3 not found: ID does not exist" Apr 22 14:19:43.764485 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764414 2577 scope.go:117] "RemoveContainer" containerID="1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de" Apr 22 14:19:43.764683 ip-10-0-131-75 kubenswrapper[2577]: I0422 
14:19:43.764664 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de"} err="failed to get container status \"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de\": rpc error: code = NotFound desc = could not find container \"1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de\": container with ID starting with 1a17ce384e7e5064bb2e164a815cb25722ffc9153a0957f602ae70d9d2a3b0de not found: ID does not exist"
Apr 22 14:19:43.764754 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764684 2577 scope.go:117] "RemoveContainer" containerID="7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008"
Apr 22 14:19:43.764953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764925 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008"} err="failed to get container status \"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008\": rpc error: code = NotFound desc = could not find container \"7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008\": container with ID starting with 7e9ef6522ba3084865ef9e0ea86bd54a28a67a3040db21f8b908f9925f510008 not found: ID does not exist"
Apr 22 14:19:43.765031 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.764954 2577 scope.go:117] "RemoveContainer" containerID="0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6"
Apr 22 14:19:43.765234 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765207 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6"} err="failed to get container status \"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6\": rpc error: code = NotFound desc = could not find container \"0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6\": container with ID starting with 0523445be5322fa8f8e597a6e4f4e345351e95f8b735ed3c2d5641b64f9aaec6 not found: ID does not exist"
Apr 22 14:19:43.765316 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765236 2577 scope.go:117] "RemoveContainer" containerID="d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7"
Apr 22 14:19:43.765464 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765444 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7"} err="failed to get container status \"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7\": rpc error: code = NotFound desc = could not find container \"d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7\": container with ID starting with d9749091fe3c418acf69db91018de8425d0ad97b7c35d064341f03cddc29afb7 not found: ID does not exist"
Apr 22 14:19:43.765534 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765464 2577 scope.go:117] "RemoveContainer" containerID="ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831"
Apr 22 14:19:43.765755 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765643 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831"} err="failed to get container status \"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831\": rpc error: code = NotFound desc = could not find container \"ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831\": container with ID starting with ba273c29ad3be8a18dbe3bd3451bd0d195407202a7d6949ffe5b2b394ef41831 not found: ID does not exist"
Apr 22 14:19:43.765755 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765668 2577 scope.go:117] "RemoveContainer" containerID="a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333"
Apr 22 14:19:43.766018 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.765880 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333"} err="failed to get container status \"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333\": rpc error: code = NotFound desc = could not find container \"a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333\": container with ID starting with a929f460f9173863ac1ac2cec4a6356bba665f905be31aaf122a1d6072beb333 not found: ID does not exist"
Apr 22 14:19:43.766585 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.766563 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:19:43.772815 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.772795 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config" (OuterVolumeSpecName: "web-config") pod "265d28e2-13af-475a-be3a-1a2e193562ee" (UID: "265d28e2-13af-475a-be3a-1a2e193562ee"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:19:43.858658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858627 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-tls-assets\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858654 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jg5fw\" (UniqueName: \"kubernetes.io/projected/265d28e2-13af-475a-be3a-1a2e193562ee-kube-api-access-jg5fw\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858664 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-main-tls\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858673 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858683 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-config-out\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858691 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-cluster-tls-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858702 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858733 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/265d28e2-13af-475a-be3a-1a2e193562ee-alertmanager-main-db\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858745 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858753 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/265d28e2-13af-475a-be3a-1a2e193562ee-metrics-client-ca\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858761 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-web-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858771 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:43.858855 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:43.858779 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/265d28e2-13af-475a-be3a-1a2e193562ee-config-volume\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:44.038677 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.038646 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:19:44.044092 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.044069 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:19:44.077071 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077043 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:19:44.077365 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077352 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" containerName="console"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077367 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" containerName="console"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077380 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077386 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077402 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="alertmanager"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077410 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="alertmanager"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077419 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="prom-label-proxy"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077426 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="prom-label-proxy"
Apr 22 14:19:44.077438 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077441 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="config-reloader"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077447 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="config-reloader"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077452 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-metric"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077457 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-metric"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077466 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="init-config-reloader"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077471 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="init-config-reloader"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077476 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-web"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077481 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-web"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077530 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-metric"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077539 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="alertmanager"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077546 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy-web"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077552 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="kube-rbac-proxy"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077559 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="prom-label-proxy"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077564 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c4ad2aa-ecf4-4dfa-bfcc-17a5e55fac56" containerName="console"
Apr 22 14:19:44.077655 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.077571 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" containerName="config-reloader"
Apr 22 14:19:44.082780 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.082759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.088795 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.088772 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 14:19:44.088953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.088812 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 14:19:44.088953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.088930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 14:19:44.088953 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.088940 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 14:19:44.089390 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.089352 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7hsj5\""
Apr 22 14:19:44.089488 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.089389 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 14:19:44.089488 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.089449 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 14:19:44.089597 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.089547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 14:19:44.089597 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.089569 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 14:19:44.096580 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.096560 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:19:44.097266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.097249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 14:19:44.160471 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160603 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160603 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160603 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160740 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160740 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160740 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrphv\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-kube-api-access-wrphv\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160863 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160863 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-config-out\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.160863 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.161005 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.161005 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-web-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.161005 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.160953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261720 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-web-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261880 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261880 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261880 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261880 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.261880 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261855 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrphv\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-kube-api-access-wrphv\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.261992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-config-out\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.262028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262119 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.262068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.262601 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.262305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.263072 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.262677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.263742 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.263702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/333a0be8-4265-4134-bbe8-011b571fbe9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.264792 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.264766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265228 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265315 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265315 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265442 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265488 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265662 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/333a0be8-4265-4134-bbe8-011b571fbe9e-config-out\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.265776 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.265763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-web-config\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.266364 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.266348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/333a0be8-4265-4134-bbe8-011b571fbe9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.271987 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.271971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrphv\" (UniqueName: \"kubernetes.io/projected/333a0be8-4265-4134-bbe8-011b571fbe9e-kube-api-access-wrphv\") pod \"alertmanager-main-0\" (UID: \"333a0be8-4265-4134-bbe8-011b571fbe9e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.391825 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.391801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:19:44.529315 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.529292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:19:44.531677 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:19:44.531654 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod333a0be8_4265_4134_bbe8_011b571fbe9e.slice/crio-136fda21814f548175e7cad26d5c86f5d06cab14a5651b2715a1516ca3d6d84a WatchSource:0}: Error finding container 136fda21814f548175e7cad26d5c86f5d06cab14a5651b2715a1516ca3d6d84a: Status 404 returned error can't find the container with id 136fda21814f548175e7cad26d5c86f5d06cab14a5651b2715a1516ca3d6d84a
Apr 22 14:19:44.719806 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.719722 2577 generic.go:358] "Generic (PLEG): container finished" podID="333a0be8-4265-4134-bbe8-011b571fbe9e" containerID="b8c0386412ff355f6d9cd80dc81303fc6e2e78b940d5c717dcc1fa4038e2a973" exitCode=0
Apr 22 14:19:44.720140 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.719807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerDied","Data":"b8c0386412ff355f6d9cd80dc81303fc6e2e78b940d5c717dcc1fa4038e2a973"}
Apr 22 14:19:44.720140 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:44.719838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"136fda21814f548175e7cad26d5c86f5d06cab14a5651b2715a1516ca3d6d84a"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"0a09e8a794b06bdf99619a69030b80695a68e3394823f7875960d178fabb6c10"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"cdc82ec7fbf5e07a2ebf82ed0826a953d414667e220ae99a9ffc1ffb7d0dc360"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"e816e1332063a0469fb7dc11bfd314fba1d6275657511d866386356d9dd241e4"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727658 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"e50ee4b2559461af268da573f6094a108c920b3c15c41eb37879a39d56c94e06"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"5908912de9c08d44ef47a68138a08d1013301d584879062be298dd3a3080564c"}
Apr 22 14:19:45.727689 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.727673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"333a0be8-4265-4134-bbe8-011b571fbe9e","Type":"ContainerStarted","Data":"81dcb16842a69a5aff76242b2eb40ad8ecadf52f6a96b0fbe5eec8ba6c53ccf8"}
Apr 22 14:19:45.747997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.747959 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265d28e2-13af-475a-be3a-1a2e193562ee" path="/var/lib/kubelet/pods/265d28e2-13af-475a-be3a-1a2e193562ee/volumes"
Apr 22 14:19:45.755459 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:45.755414 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.7554022310000001 podStartE2EDuration="1.755402231s" podCreationTimestamp="2026-04-22 14:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:19:45.753960353 +0000 UTC m=+264.601468337" watchObservedRunningTime="2026-04-22 14:19:45.755402231 +0000 UTC m=+264.602910214"
Apr 22 14:19:46.611972 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.611934 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612636 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="prometheus" containerID="cri-o://b9728c0e0ac5a16227f73a6bd17c35a5292d4aee7bf94f66106d8a6d91cd1f93" gracePeriod=600
Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612668 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-thanos" containerID="cri-o://daa91764595fe06fbaa85e31e1ae80dbeb2ced4356aefb44c7212fb7901ac2d5" gracePeriod=600
Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612694 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-web"
containerID="cri-o://be07e541bb3367dad849690ffda3510ae17af3c6665a2ea6e178c99e353c59f2" gracePeriod=600 Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612723 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy" containerID="cri-o://56a29689cce45540c217d4e1499d91b5aa8ce81a597f5d535619a8449de45628" gracePeriod=600 Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612777 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="config-reloader" containerID="cri-o://49d11125ac3006b23c1ca6b8bc11fd9957c738f3e94cf95478b94c036e867d92" gracePeriod=600 Apr 22 14:19:46.612865 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.612698 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="thanos-sidecar" containerID="cri-o://610a2695d4f4f5c55dcccdb59b2d4d45fdb1e16f5e7f89d4df0dbe519eb21b19" gracePeriod=600 Apr 22 14:19:46.734874 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734847 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="daa91764595fe06fbaa85e31e1ae80dbeb2ced4356aefb44c7212fb7901ac2d5" exitCode=0 Apr 22 14:19:46.734874 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734871 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="56a29689cce45540c217d4e1499d91b5aa8ce81a597f5d535619a8449de45628" exitCode=0 Apr 22 14:19:46.734874 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734877 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" 
containerID="be07e541bb3367dad849690ffda3510ae17af3c6665a2ea6e178c99e353c59f2" exitCode=0 Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734883 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="610a2695d4f4f5c55dcccdb59b2d4d45fdb1e16f5e7f89d4df0dbe519eb21b19" exitCode=0 Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734888 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="49d11125ac3006b23c1ca6b8bc11fd9957c738f3e94cf95478b94c036e867d92" exitCode=0 Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734895 2577 generic.go:358] "Generic (PLEG): container finished" podID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerID="b9728c0e0ac5a16227f73a6bd17c35a5292d4aee7bf94f66106d8a6d91cd1f93" exitCode=0 Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"daa91764595fe06fbaa85e31e1ae80dbeb2ced4356aefb44c7212fb7901ac2d5"} Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"56a29689cce45540c217d4e1499d91b5aa8ce81a597f5d535619a8449de45628"} Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"be07e541bb3367dad849690ffda3510ae17af3c6665a2ea6e178c99e353c59f2"} Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734968 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"610a2695d4f4f5c55dcccdb59b2d4d45fdb1e16f5e7f89d4df0dbe519eb21b19"} Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"49d11125ac3006b23c1ca6b8bc11fd9957c738f3e94cf95478b94c036e867d92"} Apr 22 14:19:46.735266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.734989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"b9728c0e0ac5a16227f73a6bd17c35a5292d4aee7bf94f66106d8a6d91cd1f93"} Apr 22 14:19:46.851854 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.851833 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:46.987931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987856 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fzfx\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.987931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987888 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.987931 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987918 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987936 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987954 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: 
\"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.987975 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988001 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988169 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988156 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988248 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988289 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: 
I0422 14:19:46.988338 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988342 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988370 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988408 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988404 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988429 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 
14:19:46.988636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988459 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988484 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988521 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988636 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988555 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\" (UID: \"5734f31b-1afd-4fc0-84e8-533af7ca0af6\") " Apr 22 14:19:46.988891 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.988812 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-metrics-client-ca\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:46.988948 ip-10-0-131-75 kubenswrapper[2577]: 
I0422 14:19:46.988907 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:46.989228 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.989196 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:46.990625 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.990595 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:46.990718 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.990685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx" (OuterVolumeSpecName: "kube-api-access-5fzfx") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "kube-api-access-5fzfx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:46.991296 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.991254 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.991381 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.991303 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:46.991549 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.991515 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out" (OuterVolumeSpecName: "config-out") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:19:46.992058 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.991867 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:19:46.992058 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.992022 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:46.992781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.992672 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.992781 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.992739 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.992939 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.992826 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.993042 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.993025 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.993193 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.993148 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.993343 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.993323 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config" (OuterVolumeSpecName: "config") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:46.993681 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:46.993662 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:47.001650 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.001632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config" (OuterVolumeSpecName: "web-config") pod "5734f31b-1afd-4fc0-84e8-533af7ca0af6" (UID: "5734f31b-1afd-4fc0-84e8-533af7ca0af6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:47.090021 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.089991 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5fzfx\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-kube-api-access-5fzfx\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090021 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090017 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090031 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-grpc-tls\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090044 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090058 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-metrics-client-certs\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090071 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090084 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-web-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090095 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090107 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090121 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5734f31b-1afd-4fc0-84e8-533af7ca0af6-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090143 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090157 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-config-out\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090203 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5734f31b-1afd-4fc0-84e8-533af7ca0af6-tls-assets\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090217 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090232 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5734f31b-1afd-4fc0-84e8-533af7ca0af6-prometheus-k8s-db\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090246 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-kube-rbac-proxy\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.090259 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.090260 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5734f31b-1afd-4fc0-84e8-533af7ca0af6-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\""
Apr 22 14:19:47.741657 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.741623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5734f31b-1afd-4fc0-84e8-533af7ca0af6","Type":"ContainerDied","Data":"0358e9aaeea188e4b33414ae8b851c7aab361a49260e551d86f128be5df28a1e"}
Apr 22 14:19:47.741657 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.741658 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.742078 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.741675 2577 scope.go:117] "RemoveContainer" containerID="daa91764595fe06fbaa85e31e1ae80dbeb2ced4356aefb44c7212fb7901ac2d5"
Apr 22 14:19:47.749976 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.749955 2577 scope.go:117] "RemoveContainer" containerID="56a29689cce45540c217d4e1499d91b5aa8ce81a597f5d535619a8449de45628"
Apr 22 14:19:47.756560 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.756543 2577 scope.go:117] "RemoveContainer" containerID="be07e541bb3367dad849690ffda3510ae17af3c6665a2ea6e178c99e353c59f2"
Apr 22 14:19:47.762623 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.762607 2577 scope.go:117] "RemoveContainer" containerID="610a2695d4f4f5c55dcccdb59b2d4d45fdb1e16f5e7f89d4df0dbe519eb21b19"
Apr 22 14:19:47.772206 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.770376 2577 scope.go:117] "RemoveContainer" containerID="49d11125ac3006b23c1ca6b8bc11fd9957c738f3e94cf95478b94c036e867d92"
Apr 22 14:19:47.772367 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.772229 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:47.776525 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.776506 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:47.780321 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.780303 2577 scope.go:117] "RemoveContainer" containerID="b9728c0e0ac5a16227f73a6bd17c35a5292d4aee7bf94f66106d8a6d91cd1f93"
Apr 22 14:19:47.786700 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.786681 2577 scope.go:117] "RemoveContainer" containerID="5252046e1df30316c85bd9fac1bbab7de20041fd9c1946445930d912dd17533e"
Apr 22 14:19:47.803102 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803082 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:47.803437 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803423 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="prometheus"
Apr 22 14:19:47.803437 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803438 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="prometheus"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803446 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="thanos-sidecar"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803451 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="thanos-sidecar"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803460 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="config-reloader"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803465 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="config-reloader"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803473 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803478 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803491 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-web"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803499 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-web"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803505 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-thanos"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803510 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-thanos"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803522 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="init-config-reloader"
Apr 22 14:19:47.803530 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803527 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="init-config-reloader"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803573 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="thanos-sidecar"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803584 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="prometheus"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803590 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803596 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-thanos"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803605 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="config-reloader"
Apr 22 14:19:47.803844 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.803612 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" containerName="kube-rbac-proxy-web"
Apr 22 14:19:47.808727 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.808712 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.811610 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 14:19:47.811610 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 14:19:47.811852 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 14:19:47.811852 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811717 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 14:19:47.812036 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 14:19:47.812036 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811858 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 14:19:47.812036 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.811950 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 14:19:47.812296 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812066 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9xx5w\""
Apr 22 14:19:47.812296 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812108 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 14:19:47.812296 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 14:19:47.812296 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 14:19:47.812501 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4r206nra9h7l5\""
Apr 22 14:19:47.812501 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.812359 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 14:19:47.815470 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.815370 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 14:19:47.817620 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.817601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 14:19:47.820942 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.820924 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:47.897354 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897354 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897354 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897594 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897594 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897594 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897594 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjcv\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-kube-api-access-9qjcv\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897732 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897828 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897741 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897828 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897828 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897778 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897941 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897941 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897887 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.897941 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.898042 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.898042 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.897995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998513 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998626 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998626 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998626 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998626 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998830 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998830 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjcv\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-kube-api-access-9qjcv\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.998924 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.998990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999066 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999327 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999327 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999327 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999510 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:47.999571 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:47.999508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.002195 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.002146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.002669 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.002643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e491164c-d333-4cf0-9826-64a0b694b3fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003805 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003805 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491164c-d333-4cf0-9826-64a0b694b3fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003805 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.003958 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.004018 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.003963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.004121 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.004103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.004302 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.004283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.004737 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.004718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491164c-d333-4cf0-9826-64a0b694b3fc-config\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.007417 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.007399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjcv\" (UniqueName: \"kubernetes.io/projected/e491164c-d333-4cf0-9826-64a0b694b3fc-kube-api-access-9qjcv\") pod \"prometheus-k8s-0\" (UID: \"e491164c-d333-4cf0-9826-64a0b694b3fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.118532 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.118505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:19:48.238006 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.237978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 14:19:48.240370 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:19:48.240339 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode491164c_d333_4cf0_9826_64a0b694b3fc.slice/crio-8ad83beab3a9ec8169d328262a7f153eff743f687852cf0fb871a46259ca8a7f WatchSource:0}: Error finding container 8ad83beab3a9ec8169d328262a7f153eff743f687852cf0fb871a46259ca8a7f: Status 404 returned error can't find the container with id 8ad83beab3a9ec8169d328262a7f153eff743f687852cf0fb871a46259ca8a7f
Apr 22 14:19:48.745678 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.745649 2577 generic.go:358] "Generic (PLEG): container finished" podID="e491164c-d333-4cf0-9826-64a0b694b3fc" containerID="7e32d3186619b59707038b8a5b2cd092e5887f24007126c4cfc7f178584c8815" exitCode=0
Apr 22 14:19:48.746086 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.745740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerDied","Data":"7e32d3186619b59707038b8a5b2cd092e5887f24007126c4cfc7f178584c8815"}
Apr 22 14:19:48.746086 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:48.745778 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"8ad83beab3a9ec8169d328262a7f153eff743f687852cf0fb871a46259ca8a7f"}
Apr 22 14:19:49.748896 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.748859 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5734f31b-1afd-4fc0-84e8-533af7ca0af6" path="/var/lib/kubelet/pods/5734f31b-1afd-4fc0-84e8-533af7ca0af6/volumes"
Apr 22 14:19:49.757096 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"0d155fd6ba7d63298bf3833b1f13b7c3b118d78ccb81a1db590920d58d1aa1bc"}
Apr 22 14:19:49.757096 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"bd68d36236cc1adfe6a161eb5a0a9ae08a8af3b51092693b3e4da91ce5bf17c5"}
Apr 22 14:19:49.757262 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"6b59d29c51578cc4e8fdd87f27d6d154c94683a0c5e7be0a2b6336bdc1f6987b"}
Apr 22 14:19:49.757262 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"13430253b695366f9f3ef6abd66be14ce07142dad9d0fb804ba2d322eb4203db"}
Apr 22 14:19:49.757262 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"bd633dc62c692c4cfebe17c5f411e6dec5ffedaf2a68ff8ab2bedd5c516f0001"}
Apr 22 14:19:49.757262 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.757131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491164c-d333-4cf0-9826-64a0b694b3fc","Type":"ContainerStarted","Data":"41cf8b52840e82b4fb43026272d3ca521c1f0b797d8bb5f30c725ba5311c63cd"}
Apr 22 14:19:49.786879 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:49.786840 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.7868273820000002 podStartE2EDuration="2.786827382s" podCreationTimestamp="2026-04-22 14:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:19:49.784851023 +0000 UTC m=+268.632359032" watchObservedRunningTime="2026-04-22 14:19:49.786827382 +0000 UTC m=+268.634335391"
Apr 22 14:19:53.119412 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:19:53.119378 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:20:21.642440 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:20:21.642420 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 14:20:48.119019 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:20:48.118990 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:20:48.133667 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:20:48.133645 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:20:48.933715 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:20:48.933689 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:24:36.967189 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.967102 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sn2j9"] Apr 22 14:24:36.972788 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.972772 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sn2j9" Apr 22 14:24:36.975617 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.975590 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 14:24:36.975744 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.975591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:24:36.975744 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.975631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:24:36.976493 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.976443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ffw5m\"" Apr 22 14:24:36.977286 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:36.977265 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sn2j9"] Apr 22 14:24:37.022559 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.022534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmf5\" (UniqueName: \"kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5\") pod \"s3-init-sn2j9\" (UID: \"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c\") " pod="kserve/s3-init-sn2j9" Apr 22 14:24:37.123911 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.123888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmf5\" (UniqueName: 
\"kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5\") pod \"s3-init-sn2j9\" (UID: \"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c\") " pod="kserve/s3-init-sn2j9" Apr 22 14:24:37.132684 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.132663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmf5\" (UniqueName: \"kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5\") pod \"s3-init-sn2j9\" (UID: \"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c\") " pod="kserve/s3-init-sn2j9" Apr 22 14:24:37.287832 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.287763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sn2j9" Apr 22 14:24:37.409586 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.409561 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sn2j9"] Apr 22 14:24:37.411127 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:24:37.411099 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c10ed12_1279_4d82_b1b0_2f41b76e1f2c.slice/crio-2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6 WatchSource:0}: Error finding container 2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6: Status 404 returned error can't find the container with id 2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6 Apr 22 14:24:37.412873 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.412853 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:24:37.574164 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:37.574090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sn2j9" event={"ID":"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c","Type":"ContainerStarted","Data":"2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6"} Apr 22 14:24:42.591598 
ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:42.591563 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sn2j9" event={"ID":"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c","Type":"ContainerStarted","Data":"94892848814f991b4c1c1f75e9df957b2d710421dd83f2a7a27c9c5731d9a740"} Apr 22 14:24:42.608642 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:42.608595 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sn2j9" podStartSLOduration=2.385014622 podStartE2EDuration="6.608581226s" podCreationTimestamp="2026-04-22 14:24:36 +0000 UTC" firstStartedPulling="2026-04-22 14:24:37.412975536 +0000 UTC m=+556.260483498" lastFinishedPulling="2026-04-22 14:24:41.636542127 +0000 UTC m=+560.484050102" observedRunningTime="2026-04-22 14:24:42.607366191 +0000 UTC m=+561.454874175" watchObservedRunningTime="2026-04-22 14:24:42.608581226 +0000 UTC m=+561.456089263" Apr 22 14:24:44.598245 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:44.598218 2577 generic.go:358] "Generic (PLEG): container finished" podID="3c10ed12-1279-4d82-b1b0-2f41b76e1f2c" containerID="94892848814f991b4c1c1f75e9df957b2d710421dd83f2a7a27c9c5731d9a740" exitCode=0 Apr 22 14:24:44.598536 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:44.598255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sn2j9" event={"ID":"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c","Type":"ContainerDied","Data":"94892848814f991b4c1c1f75e9df957b2d710421dd83f2a7a27c9c5731d9a740"} Apr 22 14:24:45.731107 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:45.731082 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sn2j9" Apr 22 14:24:45.793246 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:45.793220 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmf5\" (UniqueName: \"kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5\") pod \"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c\" (UID: \"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c\") " Apr 22 14:24:45.795325 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:45.795298 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5" (OuterVolumeSpecName: "kube-api-access-8gmf5") pod "3c10ed12-1279-4d82-b1b0-2f41b76e1f2c" (UID: "3c10ed12-1279-4d82-b1b0-2f41b76e1f2c"). InnerVolumeSpecName "kube-api-access-8gmf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:45.894315 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:45.894258 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8gmf5\" (UniqueName: \"kubernetes.io/projected/3c10ed12-1279-4d82-b1b0-2f41b76e1f2c-kube-api-access-8gmf5\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:24:46.605674 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:46.605646 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sn2j9" Apr 22 14:24:46.605835 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:46.605638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sn2j9" event={"ID":"3c10ed12-1279-4d82-b1b0-2f41b76e1f2c","Type":"ContainerDied","Data":"2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6"} Apr 22 14:24:46.605835 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:24:46.605749 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ba9cccef197b3f3f6e429625ab168ceaf703e2cc1b676d3c5058f0499506ff6" Apr 22 14:38:27.000872 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.000838 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rlx2l/must-gather-hw4gz"] Apr 22 14:38:27.001375 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.001161 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c10ed12-1279-4d82-b1b0-2f41b76e1f2c" containerName="s3-init" Apr 22 14:38:27.001375 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.001171 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c10ed12-1279-4d82-b1b0-2f41b76e1f2c" containerName="s3-init" Apr 22 14:38:27.001375 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.001242 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c10ed12-1279-4d82-b1b0-2f41b76e1f2c" containerName="s3-init" Apr 22 14:38:27.004277 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.004258 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.006803 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.006780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rlx2l\"/\"openshift-service-ca.crt\"" Apr 22 14:38:27.006894 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.006815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rlx2l\"/\"kube-root-ca.crt\"" Apr 22 14:38:27.006894 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.006786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rlx2l\"/\"default-dockercfg-xpskl\"" Apr 22 14:38:27.013340 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.013319 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rlx2l/must-gather-hw4gz"] Apr 22 14:38:27.163402 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.163373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nl94\" (UniqueName: \"kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.163560 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.163409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.264523 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.264454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.264669 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.264554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nl94\" (UniqueName: \"kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.264862 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.264843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.272502 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.272480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nl94\" (UniqueName: \"kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94\") pod \"must-gather-hw4gz\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.325565 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.325543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:27.441478 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.441345 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rlx2l/must-gather-hw4gz"] Apr 22 14:38:27.443673 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:38:27.443646 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c15ff6_c6bf_4f8d_a4fb_d7983a09c1c8.slice/crio-985fa788553a3ae52511ea9257b8f81789c9f825d28740052382e5274bd4286d WatchSource:0}: Error finding container 985fa788553a3ae52511ea9257b8f81789c9f825d28740052382e5274bd4286d: Status 404 returned error can't find the container with id 985fa788553a3ae52511ea9257b8f81789c9f825d28740052382e5274bd4286d Apr 22 14:38:27.445308 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.445293 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:38:27.941658 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:27.941629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" event={"ID":"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8","Type":"ContainerStarted","Data":"985fa788553a3ae52511ea9257b8f81789c9f825d28740052382e5274bd4286d"} Apr 22 14:38:32.959553 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:32.959470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" event={"ID":"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8","Type":"ContainerStarted","Data":"62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea"} Apr 22 14:38:32.959553 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:32.959514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" 
event={"ID":"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8","Type":"ContainerStarted","Data":"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae"} Apr 22 14:38:32.975082 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:32.975032 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" podStartSLOduration=1.72388625 podStartE2EDuration="6.975019027s" podCreationTimestamp="2026-04-22 14:38:26 +0000 UTC" firstStartedPulling="2026-04-22 14:38:27.44541287 +0000 UTC m=+1386.292920831" lastFinishedPulling="2026-04-22 14:38:32.696545644 +0000 UTC m=+1391.544053608" observedRunningTime="2026-04-22 14:38:32.973650867 +0000 UTC m=+1391.821158851" watchObservedRunningTime="2026-04-22 14:38:32.975019027 +0000 UTC m=+1391.822527010" Apr 22 14:38:49.015715 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:49.015684 2577 generic.go:358] "Generic (PLEG): container finished" podID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerID="b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae" exitCode=0 Apr 22 14:38:49.016061 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:49.015755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" event={"ID":"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8","Type":"ContainerDied","Data":"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae"} Apr 22 14:38:49.016106 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:49.016078 2577 scope.go:117] "RemoveContainer" containerID="b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae" Apr 22 14:38:49.965383 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:49.965352 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rlx2l_must-gather-hw4gz_d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8/gather/0.log" Apr 22 14:38:53.221796 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:53.221766 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-8pz7t_75ed979a-756f-4aa8-938c-caef257181c3/global-pull-secret-syncer/0.log" Apr 22 14:38:53.423512 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:53.423484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zcvnd_ebee87d9-f911-404c-9e1c-6e244d6b60cd/konnectivity-agent/0.log" Apr 22 14:38:53.483118 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:53.483050 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-75.ec2.internal_ab777a15bb250a55fa506fcc23a847b8/haproxy/0.log" Apr 22 14:38:55.451747 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.451706 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rlx2l/must-gather-hw4gz"] Apr 22 14:38:55.452115 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.451941 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="copy" containerID="cri-o://62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea" gracePeriod=2 Apr 22 14:38:55.454648 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.454614 2577 status_manager.go:895] "Failed to get status for pod" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" err="pods \"must-gather-hw4gz\" is forbidden: User \"system:node:ip-10-0-131-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rlx2l\": no relationship found between node 'ip-10-0-131-75.ec2.internal' and this object" Apr 22 14:38:55.456084 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.456052 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rlx2l/must-gather-hw4gz"] Apr 22 14:38:55.676170 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.676150 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-rlx2l_must-gather-hw4gz_d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8/copy/0.log" Apr 22 14:38:55.676504 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.676488 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:55.678699 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.678677 2577 status_manager.go:895] "Failed to get status for pod" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" err="pods \"must-gather-hw4gz\" is forbidden: User \"system:node:ip-10-0-131-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rlx2l\": no relationship found between node 'ip-10-0-131-75.ec2.internal' and this object" Apr 22 14:38:55.793896 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.793839 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nl94\" (UniqueName: \"kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94\") pod \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " Apr 22 14:38:55.794008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.793901 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output\") pod \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\" (UID: \"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8\") " Apr 22 14:38:55.795266 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.795242 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" (UID: 
"d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:38:55.795828 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.795800 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94" (OuterVolumeSpecName: "kube-api-access-8nl94") pod "d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" (UID: "d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8"). InnerVolumeSpecName "kube-api-access-8nl94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:38:55.895431 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.895404 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nl94\" (UniqueName: \"kubernetes.io/projected/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-kube-api-access-8nl94\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:38:55.895431 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:55.895426 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8-must-gather-output\") on node \"ip-10-0-131-75.ec2.internal\" DevicePath \"\"" Apr 22 14:38:56.036637 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.036610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rlx2l_must-gather-hw4gz_d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8/copy/0.log" Apr 22 14:38:56.036906 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.036885 2577 generic.go:358] "Generic (PLEG): container finished" podID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerID="62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea" exitCode=143 Apr 22 14:38:56.036962 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.036933 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rlx2l/must-gather-hw4gz" Apr 22 14:38:56.036962 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.036944 2577 scope.go:117] "RemoveContainer" containerID="62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea" Apr 22 14:38:56.046748 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.046564 2577 scope.go:117] "RemoveContainer" containerID="b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae" Apr 22 14:38:56.058666 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.058648 2577 scope.go:117] "RemoveContainer" containerID="62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea" Apr 22 14:38:56.058920 ip-10-0-131-75 kubenswrapper[2577]: E0422 14:38:56.058896 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea\": container with ID starting with 62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea not found: ID does not exist" containerID="62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea" Apr 22 14:38:56.058978 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.058930 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea"} err="failed to get container status \"62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea\": rpc error: code = NotFound desc = could not find container \"62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea\": container with ID starting with 62f69de787d581fb0434bc68465ed19a28ed1265ecc803323fc3c3ee5eb188ea not found: ID does not exist" Apr 22 14:38:56.058978 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.058949 2577 scope.go:117] "RemoveContainer" containerID="b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae" Apr 22 14:38:56.059188 ip-10-0-131-75 
kubenswrapper[2577]: E0422 14:38:56.059160 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae\": container with ID starting with b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae not found: ID does not exist" containerID="b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae" Apr 22 14:38:56.059236 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.059197 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae"} err="failed to get container status \"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae\": rpc error: code = NotFound desc = could not find container \"b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae\": container with ID starting with b484371c5abd0293ea75fdd0322a64daf4e8d78a105627c861249f79649e1aae not found: ID does not exist" Apr 22 14:38:56.956102 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.956019 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/alertmanager/0.log" Apr 22 14:38:56.976088 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.976065 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/config-reloader/0.log" Apr 22 14:38:56.994949 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:56.994928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/kube-rbac-proxy-web/0.log" Apr 22 14:38:57.033651 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.033629 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/kube-rbac-proxy/0.log"
Apr 22 14:38:57.072782 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.072763 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/kube-rbac-proxy-metric/0.log"
Apr 22 14:38:57.093429 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.093413 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/prom-label-proxy/0.log"
Apr 22 14:38:57.115971 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.115941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_333a0be8-4265-4134-bbe8-011b571fbe9e/init-config-reloader/0.log"
Apr 22 14:38:57.162905 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.162884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-87cbq_acca7920-73c1-4c87-b10d-8087b0ef338e/cluster-monitoring-operator/0.log"
Apr 22 14:38:57.280399 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.280332 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4nfd5_d61d92ca-13d0-4e38-bff7-be18fc721d92/monitoring-plugin/0.log"
Apr 22 14:38:57.495393 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.495361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qw96h_474382f8-8953-42c3-ad82-82234aea8a10/node-exporter/0.log"
Apr 22 14:38:57.518394 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.518377 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qw96h_474382f8-8953-42c3-ad82-82234aea8a10/kube-rbac-proxy/0.log"
Apr 22 14:38:57.539282 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.539228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qw96h_474382f8-8953-42c3-ad82-82234aea8a10/init-textfile/0.log"
Apr 22 14:38:57.563941 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.563919 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7nlw_8cd73319-ed5e-4a45-b956-321dea56a78e/kube-rbac-proxy-main/0.log"
Apr 22 14:38:57.583938 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.583917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7nlw_8cd73319-ed5e-4a45-b956-321dea56a78e/kube-rbac-proxy-self/0.log"
Apr 22 14:38:57.603286 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.603256 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7nlw_8cd73319-ed5e-4a45-b956-321dea56a78e/openshift-state-metrics/0.log"
Apr 22 14:38:57.637521 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.637497 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/prometheus/0.log"
Apr 22 14:38:57.653124 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.653097 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/config-reloader/0.log"
Apr 22 14:38:57.674577 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.674560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/thanos-sidecar/0.log"
Apr 22 14:38:57.693258 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.693242 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/kube-rbac-proxy-web/0.log"
Apr 22 14:38:57.714091 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.714062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/kube-rbac-proxy/0.log"
Apr 22 14:38:57.732678 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.732663 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/kube-rbac-proxy-thanos/0.log"
Apr 22 14:38:57.748117 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.748098 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" path="/var/lib/kubelet/pods/d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8/volumes"
Apr 22 14:38:57.750871 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.750855 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491164c-d333-4cf0-9826-64a0b694b3fc/init-config-reloader/0.log"
Apr 22 14:38:57.827305 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.827253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-gzhhm_e355f284-c206-4ce8-ac5f-d48f46066e84/prometheus-operator-admission-webhook/0.log"
Apr 22 14:38:57.950231 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.950210 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/thanos-query/0.log"
Apr 22 14:38:57.975140 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.975118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/kube-rbac-proxy-web/0.log"
Apr 22 14:38:57.999718 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:57.999700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/kube-rbac-proxy/0.log"
Apr 22 14:38:58.031415 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:58.031397 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/prom-label-proxy/0.log"
Apr 22 14:38:58.057229 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:58.057213 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/kube-rbac-proxy-rules/0.log"
Apr 22 14:38:58.086029 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:58.086011 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d854f498b-wj7pn_706bb132-5603-433e-95cd-80baf8a1ae5d/kube-rbac-proxy-metrics/0.log"
Apr 22 14:38:59.894668 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:38:59.894639 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-s2tpm_f0b2d52f-ab32-4412-9351-228c6d681e29/download-server/0.log"
Apr 22 14:39:00.407286 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407255 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"]
Apr 22 14:39:00.407585 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407574 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="gather"
Apr 22 14:39:00.407622 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407587 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="gather"
Apr 22 14:39:00.407622 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407604 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="copy"
Apr 22 14:39:00.407622 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407609 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="copy"
Apr 22 14:39:00.407722 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407658 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="gather"
Apr 22 14:39:00.407722 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.407671 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4c15ff6-c6bf-4f8d-a4fb-d7983a09c1c8" containerName="copy"
Apr 22 14:39:00.410752 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.410734 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.413907 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.413885 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"kube-root-ca.crt\""
Apr 22 14:39:00.414008 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.413923 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"openshift-service-ca.crt\""
Apr 22 14:39:00.415439 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.415425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bgl78\"/\"default-dockercfg-qvmhq\""
Apr 22 14:39:00.427703 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.427683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"]
Apr 22 14:39:00.533751 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.533725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-podres\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.533882 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.533776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-sys\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.533882 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.533836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-proc\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.533882 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.533864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcn6\" (UniqueName: \"kubernetes.io/projected/5ad15530-184e-40a4-a106-6b9a5d7ff851-kube-api-access-kwcn6\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.533997 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.533888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-lib-modules\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634724 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-lib-modules\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-podres\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-sys\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-proc\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634817 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcn6\" (UniqueName: \"kubernetes.io/projected/5ad15530-184e-40a4-a106-6b9a5d7ff851-kube-api-access-kwcn6\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634983 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-podres\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634983 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-lib-modules\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634983 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-sys\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.634983 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.634929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ad15530-184e-40a4-a106-6b9a5d7ff851-proc\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.644380 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.644360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcn6\" (UniqueName: \"kubernetes.io/projected/5ad15530-184e-40a4-a106-6b9a5d7ff851-kube-api-access-kwcn6\") pod \"perf-node-gather-daemonset-sxnfl\" (UID: \"5ad15530-184e-40a4-a106-6b9a5d7ff851\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.719905 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.719855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:00.829313 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.829288 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"]
Apr 22 14:39:00.831808 ip-10-0-131-75 kubenswrapper[2577]: W0422 14:39:00.831780 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5ad15530_184e_40a4_a106_6b9a5d7ff851.slice/crio-eeb9e426af1c3bfc12cf8055a20cd2fda3f0020dfa26bad5d6511fbef24aeed0 WatchSource:0}: Error finding container eeb9e426af1c3bfc12cf8055a20cd2fda3f0020dfa26bad5d6511fbef24aeed0: Status 404 returned error can't find the container with id eeb9e426af1c3bfc12cf8055a20cd2fda3f0020dfa26bad5d6511fbef24aeed0
Apr 22 14:39:00.987824 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:00.987776 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zt24h_a5aaec7d-088d-41df-9b1e-e0eb09629b1e/dns/0.log"
Apr 22 14:39:01.006371 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.006348 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zt24h_a5aaec7d-088d-41df-9b1e-e0eb09629b1e/kube-rbac-proxy/0.log"
Apr 22 14:39:01.052452 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.052430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl" event={"ID":"5ad15530-184e-40a4-a106-6b9a5d7ff851","Type":"ContainerStarted","Data":"1633ee042ede44f6367f162ca95836ce4350d1150b07c15f3da207588351ff31"}
Apr 22 14:39:01.052540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.052460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl" event={"ID":"5ad15530-184e-40a4-a106-6b9a5d7ff851","Type":"ContainerStarted","Data":"eeb9e426af1c3bfc12cf8055a20cd2fda3f0020dfa26bad5d6511fbef24aeed0"}
Apr 22 14:39:01.052540 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.052505 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:01.069242 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.069217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dtg5l_87f925f7-d447-4a1f-b742-10a72c9ef6a9/dns-node-resolver/0.log"
Apr 22 14:39:01.069420 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.069392 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl" podStartSLOduration=1.069383325 podStartE2EDuration="1.069383325s" podCreationTimestamp="2026-04-22 14:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:39:01.067707038 +0000 UTC m=+1419.915215023" watchObservedRunningTime="2026-04-22 14:39:01.069383325 +0000 UTC m=+1419.916891307"
Apr 22 14:39:01.455923 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:01.455894 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5hkxf_4532406b-2b3c-4280-be31-a1a417b34d6c/node-ca/0.log"
Apr 22 14:39:02.130230 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.130200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-88d969974-5t68d_383f416f-aae2-4ddf-82c2-ed791b0c8a02/router/0.log"
Apr 22 14:39:02.479710 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.479643 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jzpr7_428e51f0-2daf-428e-8b5a-df5ee4eab661/serve-healthcheck-canary/0.log"
Apr 22 14:39:02.842339 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.842309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-cl44f_63f7e5c1-d4ff-4d3c-ba1e-425c1585a851/insights-operator/0.log"
Apr 22 14:39:02.842492 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.842475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-cl44f_63f7e5c1-d4ff-4d3c-ba1e-425c1585a851/insights-operator/1.log"
Apr 22 14:39:02.911318 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.911291 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqwqv_62386449-d154-47a4-b239-b0b6c68f2a85/kube-rbac-proxy/0.log"
Apr 22 14:39:02.929405 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.929385 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqwqv_62386449-d154-47a4-b239-b0b6c68f2a85/exporter/0.log"
Apr 22 14:39:02.950779 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:02.950759 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fqwqv_62386449-d154-47a4-b239-b0b6c68f2a85/extractor/0.log"
Apr 22 14:39:05.110397 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:05.110369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sn2j9_3c10ed12-1279-4d82-b1b0-2f41b76e1f2c/s3-init/0.log"
Apr 22 14:39:07.063946 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:07.063920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-sxnfl"
Apr 22 14:39:08.672093 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:08.672014 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zbv5q_4ff583ff-248a-4c95-b4a5-3026c7d2f5cc/migrator/0.log"
Apr 22 14:39:08.690425 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:08.690406 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zbv5q_4ff583ff-248a-4c95-b4a5-3026c7d2f5cc/graceful-termination/0.log"
Apr 22 14:39:08.962249 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:08.962162 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sw58b_c9b18091-6470-4c4d-9813-07ff092aaa8b/kube-storage-version-migrator-operator/1.log"
Apr 22 14:39:08.963556 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:08.963532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sw58b_c9b18091-6470-4c4d-9813-07ff092aaa8b/kube-storage-version-migrator-operator/0.log"
Apr 22 14:39:09.917336 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:09.917310 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/kube-multus-additional-cni-plugins/0.log"
Apr 22 14:39:09.941000 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:09.940978 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/egress-router-binary-copy/0.log"
Apr 22 14:39:09.960312 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:09.960293 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/cni-plugins/0.log"
Apr 22 14:39:09.980975 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:09.980952 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/bond-cni-plugin/0.log"
Apr 22 14:39:10.001282 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.001245 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/routeoverride-cni/0.log"
Apr 22 14:39:10.022480 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.022451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/whereabouts-cni-bincopy/0.log"
Apr 22 14:39:10.041004 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.040981 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pq7wr_de6e4092-f486-48f9-b9c5-7b146b3d9c83/whereabouts-cni/0.log"
Apr 22 14:39:10.239723 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.239689 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bs9cg_dbe24998-6780-4517-aeb1-716266573102/kube-multus/0.log"
Apr 22 14:39:10.340594 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.340561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sfm8m_5e382d5b-073e-4cd5-adc4-f9741cc073d8/network-metrics-daemon/0.log"
Apr 22 14:39:10.360878 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:10.360844 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sfm8m_5e382d5b-073e-4cd5-adc4-f9741cc073d8/kube-rbac-proxy/0.log"
Apr 22 14:39:11.642477 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.642415 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/ovn-controller/0.log"
Apr 22 14:39:11.670895 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.670865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/ovn-acl-logging/0.log"
Apr 22 14:39:11.693102 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.693076 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/kube-rbac-proxy-node/0.log"
Apr 22 14:39:11.716197 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.716148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 14:39:11.732945 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.732923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/northd/0.log"
Apr 22 14:39:11.753104 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.753083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/nbdb/0.log"
Apr 22 14:39:11.771975 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.771954 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/sbdb/0.log"
Apr 22 14:39:11.919566 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:11.919491 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6zgvp_e13fc5ca-d417-47c6-8b6c-63651dc87d31/ovnkube-controller/0.log"
Apr 22 14:39:13.147145 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:13.147119 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dg6v9_9b411dff-3568-43e1-813c-c4ebd140399b/network-check-target-container/0.log"
Apr 22 14:39:14.008033 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:14.008006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zf5kn_c56bbf69-677d-48ac-9bdd-3f2234c4ebe1/iptables-alerter/0.log"
Apr 22 14:39:14.676230 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:14.676200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x5ds9_77146340-5d2a-4222-813e-ac3db16a7bcc/tuned/0.log"
Apr 22 14:39:17.637232 ip-10-0-131-75 kubenswrapper[2577]: I0422 14:39:17.637198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-p4rtj_4c3999a9-a639-47b4-b7ad-f4e6a9fdf38b/service-ca-controller/0.log"