Apr 23 08:12:32.221195 ip-10-0-139-180 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:12:32.746222 ip-10-0-139-180 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:12:32.746222 ip-10-0-139-180 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:12:32.746222 ip-10-0-139-180 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:12:32.746222 ip-10-0-139-180 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:12:32.746222 ip-10-0-139-180 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:12:32.747402 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.747150 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:12:32.749588 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749572 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:32.749588 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749587 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:32.749588 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749591 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749595 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749598 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749601 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749606 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749611 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749614 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749616 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749619 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749622 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749624 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749627 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749630 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749632 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749635 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749638 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749640 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749650 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749653 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749656 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:32.749725 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749658 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749661 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749664 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749668 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749671 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749675 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749678 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749681 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749683 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749686 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749689 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749691 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749694 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749697 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749700 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749703 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749705 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749708 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749711 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:32.750212 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749714 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749717 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749720 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749723 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749726 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749728 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749731 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749734 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749737 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749739 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749742 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749744 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749747 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749749 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749752 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749754 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749757 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749759 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749762 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749765 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:32.750673 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749767 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749770 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749773 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749775 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749778 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749781 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749783 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749786 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749789 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749792 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749794 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749797 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749799 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749803 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749805 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749809 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749812 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749815 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749817 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749820 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:32.751217 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749822 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749825 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749828 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749830 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.749833 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750216 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750220 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750223 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750226 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750229 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750232 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750234 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750237 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750240 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750242 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750245 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750248 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750251 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750254 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750257 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750259 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:32.751694 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750262 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750265 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750267 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750270 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750273 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750276 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750278 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750281 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750283 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750286 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750288 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750291 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750293 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750296 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750298 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750301 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750303 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750306 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750308 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750311 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:32.752206 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750314 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750316 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750319 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750321 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750324 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750326 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750329 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750331 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750334 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750337 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750339 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750342 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750345 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750349 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750354 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750357 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750360 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750364 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750366 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:32.752703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750369 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750372 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750375 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750377 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750380 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750382 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750384 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750387 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750389 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750392 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750394 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750397 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750400 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750402 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750405 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750408 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750411 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750414 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750416 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:32.753184 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750419 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750421 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750424 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750427 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750430 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750433 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750435 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750438 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750440 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750442 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750445 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.750447 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750518 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750525 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750531 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750536 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750540 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750544 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750552 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750556 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750560 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:12:32.753681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750563 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750566 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750570 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750573 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750576 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750579 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750582 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750585 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750588 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750591 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750597 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750600 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750603 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750606 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750609 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750613 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750616 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750619 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750623 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750626 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750629 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750632 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750635 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750638 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750642 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:12:32.754296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750645 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750648 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750651 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750654 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750657 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750662 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750665 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750668 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750671 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750674 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750678 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750681 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750684 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750687 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750691 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750694 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750697 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750700 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750703 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750706 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750709 2576 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750713 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750716 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750719 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750722 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 08:12:32.754946 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750725 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 08:12:32.754946 ip-10-0-139-180
kubenswrapper[2576]: I0423 08:12:32.750728 2576 flags.go:64] FLAG: --help="false" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750731 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-139-180.ec2.internal" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750734 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750737 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750740 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750744 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750748 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750751 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750755 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750758 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750761 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750764 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750775 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750779 2576 flags.go:64] FLAG: 
--kube-reserved="" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750783 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750785 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750789 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750791 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750794 2576 flags.go:64] FLAG: --lock-file="" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750797 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750800 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750803 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750808 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:12:32.755566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750811 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750815 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750817 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750820 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750824 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:12:32.756133 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:12:32.750826 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750829 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750833 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750836 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750841 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750844 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750847 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750850 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750853 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750856 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750858 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750862 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750871 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750874 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750878 2576 
flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750880 2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750883 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750889 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750892 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:12:32.756133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750895 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750898 2576 flags.go:64] FLAG: --port="10250" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750901 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750904 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f710796a88ddff86" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750920 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750923 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750926 2576 flags.go:64] FLAG: --register-node="true" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750929 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750933 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750937 2576 flags.go:64] FLAG: --registry-burst="10" 
Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750940 2576 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750942 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750945 2576 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750949 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750952 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750955 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750958 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750961 2576 flags.go:64] FLAG: --runonce="false" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750964 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750967 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750970 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750973 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750976 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750979 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750982 2576 flags.go:64] 
FLAG: --storage-driver-host="localhost:8086" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750985 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:12:32.756732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750990 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750993 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750996 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.750999 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751002 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751005 2576 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751008 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751013 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751016 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751019 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751023 2576 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751026 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751029 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:12:32.757379 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:12:32.751033 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751036 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751039 2576 flags.go:64] FLAG: --v="2" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751043 2576 flags.go:64] FLAG: --version="false" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751048 2576 flags.go:64] FLAG: --vmodule="" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751052 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751055 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751142 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751145 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751148 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751150 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:12:32.757379 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751153 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751156 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751158 2576 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751161 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751164 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751167 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751169 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751172 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751177 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751179 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751182 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751184 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751187 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751190 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751192 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:12:32.757965 ip-10-0-139-180 
kubenswrapper[2576]: W0423 08:12:32.751195 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751197 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751200 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751203 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751205 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:12:32.757965 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751207 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751211 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751214 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751217 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751220 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751222 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751225 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751228 2576 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751231 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751234 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751236 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751239 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751242 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751244 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751247 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751249 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751254 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751257 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751260 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:12:32.758458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751263 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751267 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751270 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751273 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751276 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751278 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751281 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751283 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751286 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751289 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751291 2576 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751294 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751296 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751299 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751304 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751308 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751310 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751313 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751316 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751318 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:12:32.758957 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751321 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751324 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751326 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:12:32.759445 ip-10-0-139-180 
kubenswrapper[2576]: W0423 08:12:32.751329 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751332 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751334 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751337 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751339 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751342 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751345 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751347 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751350 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751352 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751356 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751359 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751362 2576 feature_gate.go:328] 
unrecognized feature gate: ShortCertRotation
Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751364 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751367 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751369 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751372 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:32.759445 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751374 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751377 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.751380 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.751391 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.757675 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.757690 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757737 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757742 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757745 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757748 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757751 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757753 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757756 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757759 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757762 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757764 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:32.759953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757767 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757769 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757772 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757774 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757777 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757780 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757782 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757785 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757788 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757791 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757793 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757796 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757798 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757801 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757803 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757806 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757808 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757811 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757813 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757816 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:32.760397 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757818 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757822 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757825 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757828 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757831 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757833 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757836 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757838 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757841 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757843 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757848 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757852 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757855 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757857 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757860 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757863 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757866 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757869 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757871 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757874 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:32.760870 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757876 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757879 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757883 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757887 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757890 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757893 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757896 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757898 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757901 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757904 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757919 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757922 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757924 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757927 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757931 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757934 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757936 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757939 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757942 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:32.761368 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757944 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757947 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757949 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757952 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757954 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757957 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757959 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757962 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757964 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757967 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757970 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757973 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757976 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757979 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757982 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757984 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:32.761836 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.757987 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.757992 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758089 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758094 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758097 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758100 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758103 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758106 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758109 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758111 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758114 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758117 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758120 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758123 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758125 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:32.762238 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758128 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758131 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758133 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758136 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758138 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758141 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758144 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758146 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758149 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758151 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758154 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758157 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758159 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758162 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758164 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758168 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758173 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758175 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758179 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:32.762636 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758181 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758184 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758186 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758189 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758192 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758194 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758197 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758199 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758203 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758206 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758209 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758212 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758214 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758217 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758220 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758222 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758225 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758227 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758229 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758232 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:32.763118 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758234 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758237 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758239 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758242 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758244 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758247 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758249 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758252 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758255 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758257 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758260 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758262 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758265 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758267 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758270 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758272 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758275 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758277 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758280 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758282 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:32.763598 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758285 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758288 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758290 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758292 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758295 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758298 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758300 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758303 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758306 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758308 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758310 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758313 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758316 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:32.758318 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.758323 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:32.764140 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.759217 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:12:32.764507 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.764251 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:12:32.765245 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.765233 2576 server.go:1019] "Starting client certificate rotation"
Apr 23 08:12:32.765346 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.765330 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:32.765378 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.765373 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:32.795362 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.795345 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:32.804544 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.804513 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:32.817055 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.817035 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:12:32.822868 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.822850 2576 log.go:25] "Validated CRI v1 image API"
Apr 23 08:12:32.824204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.824187 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:12:32.827049 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.827030 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:12:32.828681 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.828658 2576 fs.go:135] Filesystem UUIDs: map[46986bb8-d5ff-4661-8c38-906724339b04:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a5380916-e82b-4730-a7f7-42adf048315b:/dev/nvme0n1p3]
Apr 23 08:12:32.828733 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.828682 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:12:32.834931 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.834813 2576 manager.go:217] Machine: {Timestamp:2026-04-23 08:12:32.832843543 +0000 UTC m=+0.477628177 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102186 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27e25ca429bfef63029ea40bba6555 SystemUUID:ec27e25c-a429-bfef-6302-9ea40bba6555 BootID:e68639b7-18ec-46d5-b91b-edc9af4ade08 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0f:cd:9b:98:97 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0f:cd:9b:98:97 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:6c:b5:12:b7:36 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:12:32.834931 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.834925 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:12:32.835047 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.835005 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:12:32.836218 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.836194 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:12:32.836355 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.836219 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-180.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:12:32.836403 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.836361 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:12:32.836403 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.836369 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:12:32.836403 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.836382 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:32.837352 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.837343 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:32.838718 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.838708 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:12:32.838836 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.838827 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:12:32.841362 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.841353 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:12:32.841401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.841370 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:12:32.841401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.841382 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:12:32.841401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.841391 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:12:32.841401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.841399 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:12:32.842499 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.842487 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:12:32.842544 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.842510 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:12:32.845593 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.845579 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:12:32.846905 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.846893 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 08:12:32.848044 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848028 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vxgfb"
Apr 23 08:12:32.848603 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848588 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 08:12:32.848644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848611 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 08:12:32.848644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848617 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 08:12:32.848644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848623 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 08:12:32.848644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848629 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 08:12:32.848644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848635 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848650 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848656 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848663 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848669 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848683 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 08:12:32.848775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.848692 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 08:12:32.850813 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.850802 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:12:32.850864 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.850816 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:12:32.853333 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.853310 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-180.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 08:12:32.853513 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.853457 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:12:32.853971 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.853942 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-180.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:12:32.855577 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.855565 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:12:32.855630 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.855618 2576 server.go:1295] "Started kubelet"
Apr 23 08:12:32.855711 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.855686 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:12:32.855786 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.855750 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:12:32.855816 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.855806 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:12:32.856307 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.856290 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vxgfb"
Apr 23 08:12:32.856550 ip-10-0-139-180 systemd[1]: Started Kubernetes Kubelet.
Apr 23 08:12:32.857036 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.856972 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:12:32.858422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.858404 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:12:32.862074 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.860949 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-180.ec2.internal.18a8ee324fff7329 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-180.ec2.internal,UID:ip-10-0-139-180.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-180.ec2.internal,},FirstTimestamp:2026-04-23 08:12:32.855577385 +0000 UTC m=+0.500362022,LastTimestamp:2026-04-23 08:12:32.855577385 +0000 UTC m=+0.500362022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-180.ec2.internal,}"
Apr 23 08:12:32.864297 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.864281 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:12:32.864372 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.864297 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:12:32.864991 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.864974 2576 factory.go:55] Registering systemd factory
Apr 23 08:12:32.865066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.864994 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:12:32.865066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865048 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:12:32.865066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865050 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:12:32.865195 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865073 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:12:32.865251 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865198 2576 factory.go:153] Registering CRI-O factory
Apr 23 08:12:32.865251 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865215 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:12:32.865350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865291 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:12:32.865350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865306 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:12:32.865350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865316 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:12:32.865350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865326 2576 factory.go:103] Registering Raw factory
Apr 23 08:12:32.865350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.865344 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:12:32.865527 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.865385 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:32.866480 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.866463 2576 manager.go:319] Starting recovery of all containers
Apr 23 08:12:32.866656 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.866626 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:12:32.876270 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.876100 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:32.877829 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.877815 2576 manager.go:324] Recovery completed
Apr 23 08:12:32.878546 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.878529 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-180.ec2.internal\" not found" node="ip-10-0-139-180.ec2.internal"
Apr 23 08:12:32.884388 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.884369 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:32.889287 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889272 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:32.889375 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889300 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:32.889375 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889310 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:32.889798 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889783 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:12:32.889798 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889797 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:12:32.889885 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.889814 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:12:32.892225 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.892214 2576 policy_none.go:49] "None policy: Start"
Apr 23 08:12:32.892262 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.892229 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:12:32.892262 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.892239 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:12:32.931500 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931481 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.931508 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931519 2576 server.go:85] "Starting device plugin registration server"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931771 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931782 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931862 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931956 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:32.931967 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.932497 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:12:32.937201 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:32.932524 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.000262 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.000209 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:12:33.001331 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.001316 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:12:33.001410 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.001341 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:12:33.001410 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.001358 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:12:33.001410 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.001365 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:12:33.001410 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.001395 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:12:33.003787 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.003768 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:33.032834 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.032808 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:33.033604 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.033584 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:33.033671 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.033613 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:33.033671 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.033627 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:33.033671 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.033649 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.043901 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.043887 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.043956 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.043919 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-180.ec2.internal\": node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.072114 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.072095 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.102206 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.102159 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"]
Apr 23 08:12:33.102268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.102252 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:33.103000 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.102982 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:33.103089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.103008 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:33.103089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.103020 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:33.104837 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.104825 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:33.104966 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.104952 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.105015 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.104986 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:33.105521 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105504 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:33.105521 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105516 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:33.105623 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105535 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:33.105623 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105546 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:33.105623 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105536 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:33.105623 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.105573 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:33.107450 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.107438 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.107514 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.107460 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:33.108137 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.108120 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:33.108190 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.108153 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:33.108190 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.108169 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:33.129473 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.129455 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-180.ec2.internal\" not found" node="ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.133732 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.133718 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-180.ec2.internal\" not found" node="ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.167306 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.167287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.167374 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.167309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.167374 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.167326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c11cee49b20d4343744d29b36ea3100-config\") pod \"kube-apiserver-proxy-ip-10-0-139-180.ec2.internal\" (UID: \"5c11cee49b20d4343744d29b36ea3100\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.172386 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.172371 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.268167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.268167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.268167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c11cee49b20d4343744d29b36ea3100-config\") pod \"kube-apiserver-proxy-ip-10-0-139-180.ec2.internal\" (UID: \"5c11cee49b20d4343744d29b36ea3100\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.268167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.268373 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/437388d855f943d60f9f802849e99eca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal\" (UID: \"437388d855f943d60f9f802849e99eca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.268373 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.268278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c11cee49b20d4343744d29b36ea3100-config\") pod \"kube-apiserver-proxy-ip-10-0-139-180.ec2.internal\" (UID: \"5c11cee49b20d4343744d29b36ea3100\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.272463 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.272448 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.373530 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.373493 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.433707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.433671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.437318 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.437172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.473693 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.473664 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.574265 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.574197 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.674677 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.674647 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.764869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.764838 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:12:33.765451 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.764986 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:12:33.765451 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.764985 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:12:33.775225 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.775201 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.858988 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.858943 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:07:32 +0000 UTC" deadline="2027-10-01 23:20:43.224527659 +0000 UTC"
Apr 23 08:12:33.858988 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.858984 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12639h8m9.365546866s"
Apr 23 08:12:33.864921 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.864892 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:12:33.874647 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.874625 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:12:33.875689 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:33.875672 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-180.ec2.internal\" not found"
Apr 23 08:12:33.895671 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.895647 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nqp2p"
Apr 23 08:12:33.900645 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.900628 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nqp2p"
Apr 23 08:12:33.915046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.915027 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:33.948875 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:33.948788 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437388d855f943d60f9f802849e99eca.slice/crio-ff6acf50898ad4a1205c32f168d47852bb4ce7aef208693f92447723fed7cca1 WatchSource:0}: Error finding container ff6acf50898ad4a1205c32f168d47852bb4ce7aef208693f92447723fed7cca1: Status 404 returned error can't find the container with id ff6acf50898ad4a1205c32f168d47852bb4ce7aef208693f92447723fed7cca1
Apr 23 08:12:33.950458 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:33.950437 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c11cee49b20d4343744d29b36ea3100.slice/crio-28f45fed4d3636482203b5910695eec863a86f08ca125012a60281892febd61c WatchSource:0}: Error finding container 28f45fed4d3636482203b5910695eec863a86f08ca125012a60281892febd61c: Status 404 returned error can't find the container with id 28f45fed4d3636482203b5910695eec863a86f08ca125012a60281892febd61c
Apr 23 08:12:33.955181 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.955158 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:12:33.964569 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.964551 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.976128 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.976110 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:12:33.977795 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.977781 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal"
Apr 23 08:12:33.984192 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:33.984174 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:12:34.003825 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.003784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal" event={"ID":"437388d855f943d60f9f802849e99eca","Type":"ContainerStarted","Data":"ff6acf50898ad4a1205c32f168d47852bb4ce7aef208693f92447723fed7cca1"}
Apr 23 08:12:34.004638 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.004619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal" event={"ID":"5c11cee49b20d4343744d29b36ea3100","Type":"ContainerStarted","Data":"28f45fed4d3636482203b5910695eec863a86f08ca125012a60281892febd61c"}
Apr 23 08:12:34.179344 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.179318 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:34.632822 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.632796 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:34.734682 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.734653 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:34.843316 ip-10-0-139-180
kubenswrapper[2576]: I0423 08:12:34.843289 2576 apiserver.go:52] "Watching apiserver" Apr 23 08:12:34.853332 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.853307 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:12:34.854824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.854800 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g7chr","openshift-network-diagnostics/network-check-target-jb8d5","openshift-ovn-kubernetes/ovnkube-node-rgndg","kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal","openshift-dns/node-resolver-rkg5x","openshift-multus/multus-sgplt","openshift-multus/network-metrics-daemon-fflkd","openshift-network-operator/iptables-alerter-ncw6m","kube-system/konnectivity-agent-c7k5b","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw","openshift-cluster-node-tuning-operator/tuned-5xcj5","openshift-image-registry/node-ca-stmzm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal"] Apr 23 08:12:34.859254 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.859224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.859640 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.859466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:34.859640 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.859543 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9" Apr 23 08:12:34.861354 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.861334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.861936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.861976 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.861992 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.861997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.862049 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:12:34.862094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.862079 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jxws\"" Apr 23 08:12:34.863232 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.863211 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:34.863788 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.863640 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:12:34.863788 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.863675 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:12:34.863969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.863942 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:12:34.863969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.863951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:12:34.864270 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.864251 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:12:34.864270 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.864266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:12:34.864644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.864628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qz2cb\"" Apr 23 08:12:34.865417 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.865372 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.865938 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.865782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7qspq\"" Apr 23 08:12:34.866032 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.866017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:12:34.866177 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.866157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:12:34.867316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.867284 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:12:34.867408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.867341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nl2hr\"" Apr 23 08:12:34.868368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.868351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:34.868472 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.868446 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:34.870887 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.870862 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:34.873302 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.873059 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:34.873302 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.873077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:12:34.873302 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.873117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:12:34.873302 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.873125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jmfxl\"" Apr 23 08:12:34.873530 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.873510 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875060 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-multus-certs\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-cnibin\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-os-release\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875254 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:12:34.875261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8gssx\"" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-systemd-units\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875634 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:12:34.875341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-log-socket\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-config\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-netns\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-bin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08aba02d-ea63-43f8-9e3d-409a65aa759d-tmp-dir\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " 
pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:34.875634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-env-overrides\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmth\" (UniqueName: \"kubernetes.io/projected/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-kube-api-access-qjmth\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cnibin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf4f4\" (UniqueName: 
\"kubernetes.io/projected/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-kube-api-access-bf4f4\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-netd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.875969 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-system-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.875969 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:12:34.875956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-os-release\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.875978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cni-binary-copy\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjr8v\" (UniqueName: \"kubernetes.io/projected/42d0b805-f001-437e-a62c-21317c5168f5-kube-api-access-wjr8v\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-etc-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: 
\"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-kubelet\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-system-cni-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-netns\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-multus\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-kubelet\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-slash\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-ovn\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876445 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-bin\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/08aba02d-ea63-43f8-9e3d-409a65aa759d-hosts-file\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-conf-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-etc-kubernetes\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " 
pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-systemd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-var-lib-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-node-log\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-script-lib\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-socket-dir-parent\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-hostroot\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovn-node-metrics-cert\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.876869 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkp48\" (UniqueName: \"kubernetes.io/projected/08aba02d-ea63-43f8-9e3d-409a65aa759d-kube-api-access-hkp48\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x"
Apr 23 08:12:34.877595 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-k8s-cni-cncf-io\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.877595 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.876946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-daemon-config\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.878384 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.878115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.878384 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.878195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.880218 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:12:34.880305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880205 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 08:12:34.880305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880288 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-stmzm"
Apr 23 08:12:34.880746 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-x5zd7\""
Apr 23 08:12:34.880854 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 08:12:34.880945 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 08:12:34.881011 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.880835 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmfdz\""
Apr 23 08:12:34.881809 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.881641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:12:34.883468 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.882868 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:12:34.883468 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.882965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:12:34.883468 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.883427 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:12:34.884116 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.883798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vkhph\""
Apr 23 08:12:34.901601 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.901577 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:33 +0000 UTC" deadline="2027-12-15 06:34:56.494590216 +0000 UTC"
Apr 23 08:12:34.901601 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.901600 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14422h22m21.592993014s"
Apr 23 08:12:34.965812 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.965784 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:12:34.977462 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-daemon-config\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.977597 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-multus-certs\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.977597 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-run\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977597 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-lib-modules\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977597 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-host\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977597 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-multus-certs\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkp48\" (UniqueName: \"kubernetes.io/projected/08aba02d-ea63-43f8-9e3d-409a65aa759d-kube-api-access-hkp48\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysconfig\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-etc-tuned\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-tmp\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnzw\" (UniqueName: \"kubernetes.io/projected/54fc82b6-757a-4606-81ce-75c113d9a233-kube-api-access-sdnzw\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxbh6\" (UniqueName: \"kubernetes.io/projected/522a528b-85f9-4e19-a62b-b53d7868c26e-kube-api-access-cxbh6\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-systemd-units\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.977963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-log-socket\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.977988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-config\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-log-socket\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-systemd-units\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-netns\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-netns\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-sys\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmth\" (UniqueName: \"kubernetes.io/projected/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-kube-api-access-qjmth\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf4f4\" (UniqueName: \"kubernetes.io/projected/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-kube-api-access-bf4f4\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/724188f0-71e3-4a41-97c0-51d0f88e6c75-iptables-alerter-script\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-kubernetes\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-systemd\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-netd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-system-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjr8v\" (UniqueName: \"kubernetes.io/projected/42d0b805-f001-437e-a62c-21317c5168f5-kube-api-access-wjr8v\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.978408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rmj\" (UniqueName: \"kubernetes.io/projected/df2ff433-01c3-442f-b962-0dbfe4dd622f-kube-api-access-j2rmj\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-netd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-etc-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-config\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-conf\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-etc-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqhmx\" (UniqueName: \"kubernetes.io/projected/152c2fa0-6d86-426a-bbda-00226a6a9cc0-kube-api-access-hqhmx\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-system-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-netns\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/724188f0-71e3-4a41-97c0-51d0f88e6c75-host-slash\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-netns\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-kubelet\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-ovn\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-bin\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-kubelet\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/08aba02d-ea63-43f8-9e3d-409a65aa759d-hosts-file\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-daemon-config\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-cni-bin\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.979197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-ovn\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-etc-kubernetes\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-etc-kubernetes\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvf9m\" (UniqueName: \"kubernetes.io/projected/724188f0-71e3-4a41-97c0-51d0f88e6c75-kube-api-access-pvf9m\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-socket-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/08aba02d-ea63-43f8-9e3d-409a65aa759d-hosts-file\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-systemd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-var-lib-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.978972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-systemd\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-var-lib-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-node-log\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-script-lib\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-socket-dir-parent\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-node-log\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovn-node-metrics-cert\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:12:34.980016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-k8s-cni-cncf-io\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-cnibin\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-os-release\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fb79acc-affd-4077-a7a0-e13654094fad-agent-certs\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-registration-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42d0b805-f001-437e-a62c-21317c5168f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-bin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-run-k8s-cni-cncf-io\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/152c2fa0-6d86-426a-bbda-00226a6a9cc0-host\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08aba02d-ea63-43f8-9e3d-409a65aa759d-tmp-dir\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-cnibin\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\"
(UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-socket-dir-parent\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-env-overrides\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979449 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cnibin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.980841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-var-lib-kubelet\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-os-release\") pod \"multus-additional-cni-plugins-g7chr\" (UID: 
\"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-device-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-bin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-os-release\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cni-binary-copy\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-os-release\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/152c2fa0-6d86-426a-bbda-00226a6a9cc0-serviceca\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-run-openvswitch\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-kubelet\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-system-cni-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-kubelet\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cnibin\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.979905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.981649 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08aba02d-ea63-43f8-9e3d-409a65aa759d-tmp-dir\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-multus\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-env-overrides\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-host-var-lib-cni-multus\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42d0b805-f001-437e-a62c-21317c5168f5-system-cni-dir\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980147 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-slash\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-cni-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-conf-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-slash\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-cni-binary-copy\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovnkube-script-lib\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-modprobe-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-multus-conf-dir\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.982411 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fb79acc-affd-4077-a7a0-e13654094fad-konnectivity-ca\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-sys-fs\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980351 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-hostroot\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.980465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-hostroot\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.983234 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.982855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-ovn-node-metrics-cert\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.984500 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.984464 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:34.984500 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.984489 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:34.984500 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.984502 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:34.985473 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:34.984583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:35.484555944 +0000 UTC m=+3.129340569 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:34.985805 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.985739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkp48\" (UniqueName: \"kubernetes.io/projected/08aba02d-ea63-43f8-9e3d-409a65aa759d-kube-api-access-hkp48\") pod \"node-resolver-rkg5x\" (UID: \"08aba02d-ea63-43f8-9e3d-409a65aa759d\") " pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:34.986839 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.986795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmth\" (UniqueName: \"kubernetes.io/projected/2c90cc59-7d96-4997-8dd0-3c2ac01d264d-kube-api-access-qjmth\") pod \"ovnkube-node-rgndg\" (UID: \"2c90cc59-7d96-4997-8dd0-3c2ac01d264d\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:34.986947 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.986847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf4f4\" (UniqueName: \"kubernetes.io/projected/18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1-kube-api-access-bf4f4\") pod \"multus-sgplt\" (UID: \"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1\") " pod="openshift-multus/multus-sgplt" Apr 23 08:12:34.987689 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:34.987667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjr8v\" (UniqueName: \"kubernetes.io/projected/42d0b805-f001-437e-a62c-21317c5168f5-kube-api-access-wjr8v\") pod \"multus-additional-cni-plugins-g7chr\" (UID: \"42d0b805-f001-437e-a62c-21317c5168f5\") " 
pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:35.081256 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxbh6\" (UniqueName: \"kubernetes.io/projected/522a528b-85f9-4e19-a62b-b53d7868c26e-kube-api-access-cxbh6\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.081399 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-sys\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081399 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/724188f0-71e3-4a41-97c0-51d0f88e6c75-iptables-alerter-script\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.081519 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-kubernetes\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081519 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-systemd\") pod 
\"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081519 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rmj\" (UniqueName: \"kubernetes.io/projected/df2ff433-01c3-442f-b962-0dbfe4dd622f-kube-api-access-j2rmj\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:35.081519 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-conf\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqhmx\" (UniqueName: \"kubernetes.io/projected/152c2fa0-6d86-426a-bbda-00226a6a9cc0-kube-api-access-hqhmx\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/724188f0-71e3-4a41-97c0-51d0f88e6c75-host-slash\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-kubernetes\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvf9m\" (UniqueName: \"kubernetes.io/projected/724188f0-71e3-4a41-97c0-51d0f88e6c75-kube-api-access-pvf9m\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-systemd\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-conf\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-socket-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/724188f0-71e3-4a41-97c0-51d0f88e6c75-host-slash\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fb79acc-affd-4077-a7a0-e13654094fad-agent-certs\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-registration-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.081701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/152c2fa0-6d86-426a-bbda-00226a6a9cc0-host\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081766 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-var-lib-kubelet\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-socket-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-registration-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-device-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/152c2fa0-6d86-426a-bbda-00226a6a9cc0-serviceca\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: 
I0423 08:12:35.081844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-etc-selinux\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-var-lib-kubelet\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/152c2fa0-6d86-426a-bbda-00226a6a9cc0-host\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-device-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:12:35.081965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysctl-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.081972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-sys\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-modprobe-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.082038 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fb79acc-affd-4077-a7a0-e13654094fad-konnectivity-ca\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:35.082197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-modprobe-d\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.082099 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:12:35.582081144 +0000 UTC m=+3.226865786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-sys-fs\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-run\") pod \"tuned-5xcj5\" (UID: 
\"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-lib-modules\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-host\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-sys-fs\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/152c2fa0-6d86-426a-bbda-00226a6a9cc0-serviceca\") pod \"node-ca-stmzm\" (UID: \"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/724188f0-71e3-4a41-97c0-51d0f88e6c75-iptables-alerter-script\") pod \"iptables-alerter-ncw6m\" (UID: 
\"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-run\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysconfig\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-etc-tuned\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-host\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-tmp\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnzw\" (UniqueName: \"kubernetes.io/projected/54fc82b6-757a-4606-81ce-75c113d9a233-kube-api-access-sdnzw\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-lib-modules\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54fc82b6-757a-4606-81ce-75c113d9a233-etc-sysconfig\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.083046 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.083945 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/522a528b-85f9-4e19-a62b-b53d7868c26e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: 
\"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.083945 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.082626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fb79acc-affd-4077-a7a0-e13654094fad-konnectivity-ca\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:35.084520 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.084477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-etc-tuned\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.084520 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.084507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54fc82b6-757a-4606-81ce-75c113d9a233-tmp\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.084635 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.084607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fb79acc-affd-4077-a7a0-e13654094fad-agent-certs\") pod \"konnectivity-agent-c7k5b\" (UID: \"9fb79acc-affd-4077-a7a0-e13654094fad\") " pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:35.089878 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.089851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqhmx\" (UniqueName: \"kubernetes.io/projected/152c2fa0-6d86-426a-bbda-00226a6a9cc0-kube-api-access-hqhmx\") pod \"node-ca-stmzm\" (UID: 
\"152c2fa0-6d86-426a-bbda-00226a6a9cc0\") " pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.090496 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.090471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rmj\" (UniqueName: \"kubernetes.io/projected/df2ff433-01c3-442f-b962-0dbfe4dd622f-kube-api-access-j2rmj\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:35.090629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.090474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxbh6\" (UniqueName: \"kubernetes.io/projected/522a528b-85f9-4e19-a62b-b53d7868c26e-kube-api-access-cxbh6\") pod \"aws-ebs-csi-driver-node-fqwjw\" (UID: \"522a528b-85f9-4e19-a62b-b53d7868c26e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.090675 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.090621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvf9m\" (UniqueName: \"kubernetes.io/projected/724188f0-71e3-4a41-97c0-51d0f88e6c75-kube-api-access-pvf9m\") pod \"iptables-alerter-ncw6m\" (UID: \"724188f0-71e3-4a41-97c0-51d0f88e6c75\") " pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.090808 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.090792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnzw\" (UniqueName: \"kubernetes.io/projected/54fc82b6-757a-4606-81ce-75c113d9a233-kube-api-access-sdnzw\") pod \"tuned-5xcj5\" (UID: \"54fc82b6-757a-4606-81ce-75c113d9a233\") " pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.170938 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.170840 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g7chr" Apr 23 08:12:35.176842 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.176818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:35.186628 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.186607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rkg5x" Apr 23 08:12:35.192218 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.192199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgplt" Apr 23 08:12:35.202577 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.202547 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ncw6m" Apr 23 08:12:35.207115 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.207100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:35.212639 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.212622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" Apr 23 08:12:35.219200 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.219184 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" Apr 23 08:12:35.224695 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.224676 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-stmzm" Apr 23 08:12:35.485496 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.485402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:35.485623 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.485559 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:35.485623 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.485577 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:35.485623 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.485588 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:35.485744 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.485662 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:36.485648039 +0000 UTC m=+4.130432665 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:35.548153 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.548111 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724188f0_71e3_4a41_97c0_51d0f88e6c75.slice/crio-c2b335bf48f82464c0c39575daaf3ad5d03ef04665c47f3b7f40f07c0c8aa3ec WatchSource:0}: Error finding container c2b335bf48f82464c0c39575daaf3ad5d03ef04665c47f3b7f40f07c0c8aa3ec: Status 404 returned error can't find the container with id c2b335bf48f82464c0c39575daaf3ad5d03ef04665c47f3b7f40f07c0c8aa3ec Apr 23 08:12:35.550471 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.550447 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cb3bcf_df53_4ee2_abe1_5fd5156e3bc1.slice/crio-7c7ac17a2286a250668ae63319c899113487942d7d9ab79e1f2c852c22ba0aed WatchSource:0}: Error finding container 7c7ac17a2286a250668ae63319c899113487942d7d9ab79e1f2c852c22ba0aed: Status 404 returned error can't find the container with id 7c7ac17a2286a250668ae63319c899113487942d7d9ab79e1f2c852c22ba0aed Apr 23 08:12:35.552928 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.552878 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d0b805_f001_437e_a62c_21317c5168f5.slice/crio-c0c0ccd153787979ff50bf29b010c40d321d3b055091cfa556f0641066ece1db WatchSource:0}: Error finding container c0c0ccd153787979ff50bf29b010c40d321d3b055091cfa556f0641066ece1db: Status 404 returned error can't find the 
container with id c0c0ccd153787979ff50bf29b010c40d321d3b055091cfa556f0641066ece1db Apr 23 08:12:35.553803 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.553618 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb79acc_affd_4077_a7a0_e13654094fad.slice/crio-d806f0893655410ae7ee6d96e9c113186e07501709a1f84a271b98d4121ff0e0 WatchSource:0}: Error finding container d806f0893655410ae7ee6d96e9c113186e07501709a1f84a271b98d4121ff0e0: Status 404 returned error can't find the container with id d806f0893655410ae7ee6d96e9c113186e07501709a1f84a271b98d4121ff0e0 Apr 23 08:12:35.554509 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.554477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54fc82b6_757a_4606_81ce_75c113d9a233.slice/crio-aaf479dc7177c784ad7134578caba98b35b6ea38ffcf9d8c17ecb7a5273db797 WatchSource:0}: Error finding container aaf479dc7177c784ad7134578caba98b35b6ea38ffcf9d8c17ecb7a5273db797: Status 404 returned error can't find the container with id aaf479dc7177c784ad7134578caba98b35b6ea38ffcf9d8c17ecb7a5273db797 Apr 23 08:12:35.555466 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.555346 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522a528b_85f9_4e19_a62b_b53d7868c26e.slice/crio-97b53bb6df0f8c2ddfe5d33daaf9adc7c0bd1031ac337b966551206f84f518f4 WatchSource:0}: Error finding container 97b53bb6df0f8c2ddfe5d33daaf9adc7c0bd1031ac337b966551206f84f518f4: Status 404 returned error can't find the container with id 97b53bb6df0f8c2ddfe5d33daaf9adc7c0bd1031ac337b966551206f84f518f4 Apr 23 08:12:35.556343 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.556320 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08aba02d_ea63_43f8_9e3d_409a65aa759d.slice/crio-f68fbd08a35062c72bb9895c50104d4681ffaea288c0574802d35d6bb00655ec WatchSource:0}: Error finding container f68fbd08a35062c72bb9895c50104d4681ffaea288c0574802d35d6bb00655ec: Status 404 returned error can't find the container with id f68fbd08a35062c72bb9895c50104d4681ffaea288c0574802d35d6bb00655ec Apr 23 08:12:35.556764 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.556707 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod152c2fa0_6d86_426a_bbda_00226a6a9cc0.slice/crio-52df1db53f3de9786a17433af2a4d49668f300f4718b7e3272d55492e2cc17f1 WatchSource:0}: Error finding container 52df1db53f3de9786a17433af2a4d49668f300f4718b7e3272d55492e2cc17f1: Status 404 returned error can't find the container with id 52df1db53f3de9786a17433af2a4d49668f300f4718b7e3272d55492e2cc17f1 Apr 23 08:12:35.557998 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:12:35.557893 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c90cc59_7d96_4997_8dd0_3c2ac01d264d.slice/crio-e532fe7e76ad4ff783907dc7d3b38c21e81b5bbe8d62f41335f5c0f077b03ffb WatchSource:0}: Error finding container e532fe7e76ad4ff783907dc7d3b38c21e81b5bbe8d62f41335f5c0f077b03ffb: Status 404 returned error can't find the container with id e532fe7e76ad4ff783907dc7d3b38c21e81b5bbe8d62f41335f5c0f077b03ffb Apr 23 08:12:35.585996 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.585975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:35.586090 ip-10-0-139-180 kubenswrapper[2576]: 
E0423 08:12:35.586079 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.586150 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:35.586143 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:12:36.586123146 +0000 UTC m=+4.230907768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:35.903164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.902608 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:33 +0000 UTC" deadline="2028-01-24 15:16:32.713111558 +0000 UTC" Apr 23 08:12:35.903164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:35.902880 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15391h3m56.810240342s" Apr 23 08:12:36.013784 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.013747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"e532fe7e76ad4ff783907dc7d3b38c21e81b5bbe8d62f41335f5c0f077b03ffb"} Apr 23 08:12:36.015255 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.015227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rkg5x" 
event={"ID":"08aba02d-ea63-43f8-9e3d-409a65aa759d","Type":"ContainerStarted","Data":"f68fbd08a35062c72bb9895c50104d4681ffaea288c0574802d35d6bb00655ec"}
Apr 23 08:12:36.018386 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.018358    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" event={"ID":"522a528b-85f9-4e19-a62b-b53d7868c26e","Type":"ContainerStarted","Data":"97b53bb6df0f8c2ddfe5d33daaf9adc7c0bd1031ac337b966551206f84f518f4"}
Apr 23 08:12:36.020757 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.020731    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" event={"ID":"54fc82b6-757a-4606-81ce-75c113d9a233","Type":"ContainerStarted","Data":"aaf479dc7177c784ad7134578caba98b35b6ea38ffcf9d8c17ecb7a5273db797"}
Apr 23 08:12:36.025380 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.025107    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgplt" event={"ID":"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1","Type":"ContainerStarted","Data":"7c7ac17a2286a250668ae63319c899113487942d7d9ab79e1f2c852c22ba0aed"}
Apr 23 08:12:36.034238 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.034178    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ncw6m" event={"ID":"724188f0-71e3-4a41-97c0-51d0f88e6c75","Type":"ContainerStarted","Data":"c2b335bf48f82464c0c39575daaf3ad5d03ef04665c47f3b7f40f07c0c8aa3ec"}
Apr 23 08:12:36.038833 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.038085    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal" event={"ID":"5c11cee49b20d4343744d29b36ea3100","Type":"ContainerStarted","Data":"a00765d9c16da025c869e619d3d97856773682e9742d34beb40d227a3fd79381"}
Apr 23 08:12:36.040175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.040143    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-stmzm" event={"ID":"152c2fa0-6d86-426a-bbda-00226a6a9cc0","Type":"ContainerStarted","Data":"52df1db53f3de9786a17433af2a4d49668f300f4718b7e3272d55492e2cc17f1"}
Apr 23 08:12:36.048190 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.048163    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c7k5b" event={"ID":"9fb79acc-affd-4077-a7a0-e13654094fad","Type":"ContainerStarted","Data":"d806f0893655410ae7ee6d96e9c113186e07501709a1f84a271b98d4121ff0e0"}
Apr 23 08:12:36.053560 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.053183    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-180.ec2.internal" podStartSLOduration=3.053168407 podStartE2EDuration="3.053168407s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:12:36.052291628 +0000 UTC m=+3.697076277" watchObservedRunningTime="2026-04-23 08:12:36.053168407 +0000 UTC m=+3.697953052"
Apr 23 08:12:36.059758 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.059700    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerStarted","Data":"c0c0ccd153787979ff50bf29b010c40d321d3b055091cfa556f0641066ece1db"}
Apr 23 08:12:36.495069 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.494380    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:36.495069 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.494578    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:36.495069 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.494599    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:36.495069 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.494611    2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:36.495069 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.494674    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:38.494654031 +0000 UTC m=+6.139438676 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:36.598044 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:36.597421    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:36.598044 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.597606    2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:36.598044 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:36.597670    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:12:38.597649006 +0000 UTC m=+6.242433631 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:37.005009 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:37.004978    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:37.005457 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:37.005120    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:37.005563 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:37.005525    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:37.005668 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:37.005615    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:37.080278 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:37.080237    2576 generic.go:358] "Generic (PLEG): container finished" podID="437388d855f943d60f9f802849e99eca" containerID="2040beb47dff9554e920d3145133f7c7f97e1972de5d606311858c02288086a0" exitCode=0
Apr 23 08:12:37.081024 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:37.080969    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal" event={"ID":"437388d855f943d60f9f802849e99eca","Type":"ContainerDied","Data":"2040beb47dff9554e920d3145133f7c7f97e1972de5d606311858c02288086a0"}
Apr 23 08:12:38.089592 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:38.089507    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal" event={"ID":"437388d855f943d60f9f802849e99eca","Type":"ContainerStarted","Data":"e82d925b5e7653594a873d1186e0fdcf3bccbf3a5864dee576c968e49890f2e5"}
Apr 23 08:12:38.514204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:38.513962    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:38.514204 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.514186    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:38.514204 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.514207    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:38.514453 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.514220    2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:38.514453 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.514279    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:42.51426071 +0000 UTC m=+10.159045331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:38.615079 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:38.615043    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:38.615257 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.615220    2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:38.615327 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:38.615313    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:12:42.615293647 +0000 UTC m=+10.260078285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:39.002766 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:39.002660    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:39.002766 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:39.002677    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:39.003013 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:39.002809    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:39.003013 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:39.002845    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:41.002104 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:41.002012    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:41.002576 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:41.002111    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:41.002643 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:41.002624    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:41.002833 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:41.002804    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:42.548216 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:42.548177    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:42.548689 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.548345    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:42.548689 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.548366    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:42.548689 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.548379    2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:42.548689 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.548438    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:50.548422487 +0000 UTC m=+18.193207109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:42.649160 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:42.649126    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:42.649343 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.649287    2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:42.649411 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:42.649349    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:12:50.649330044 +0000 UTC m=+18.294114673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:43.004157 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:43.003436    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:43.004157 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:43.003558    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:43.004157 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:43.003971    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:43.004157 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:43.004061    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:45.002440 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:45.002357    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:45.002829 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:45.002493    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:45.002829 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:45.002554    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:45.002829 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:45.002690    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:47.001842 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:47.001809    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:47.001842 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:47.001827    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:47.002348 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:47.001947    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:47.002348 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:47.002070    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:49.004797 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.004767    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:49.005215 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.004771    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:49.005215 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:49.004874    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:49.005215 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:49.004995    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:49.708424 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.708375    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-180.ec2.internal" podStartSLOduration=16.708356224 podStartE2EDuration="16.708356224s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:12:38.105597024 +0000 UTC m=+5.750381670" watchObservedRunningTime="2026-04-23 08:12:49.708356224 +0000 UTC m=+17.353140868"
Apr 23 08:12:49.708865 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.708844    2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vdn4h"]
Apr 23 08:12:49.718454 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.718431    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.718581 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:49.718508    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:12:49.801863 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.801832    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.801863 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.801876    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-kubelet-config\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.802083 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.801893    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-dbus\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.902720 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.902684    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.902886 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.902737    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-kubelet-config\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.902886 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.902765    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-dbus\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.902886 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.902839    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-kubelet-config\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.902886 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:49.902865    2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:49.903180 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:49.902905    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d194f1f2-2b7e-468a-86d7-142892eaac07-dbus\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:49.903180 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:49.902943    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:50.402923115 +0000 UTC m=+18.047707757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:50.405920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:50.405862    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:50.406360 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.406016    2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:50.406360 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.406086    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:51.406065575 +0000 UTC m=+19.050850200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:50.607767 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:50.607730    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:50.607964 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.607883    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:50.607964 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.607900    2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:50.607964 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.607925    2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:50.608121 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.607978    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:06.607964859 +0000 UTC m=+34.252749485 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:50.708510 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:50.708415    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:50.708667 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.708574    2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:50.708667 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:50.708647    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:13:06.708627635 +0000 UTC m=+34.353412265 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:51.005687 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:51.005620    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:51.005839 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:51.005620    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:51.005839 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:51.005721    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:12:51.005839 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:51.005620    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:51.005839 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:51.005808    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:12:51.006049 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:51.005870    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:51.412815 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:51.412735    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:51.413235 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:51.412904    2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:51.413235 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:51.412989    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:53.412970796 +0000 UTC m=+21.057755461 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.003234    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:53.003448    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.003484    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:53.003525    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.003549    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:12:53.003607 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:53.003590    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:53.127744 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.127544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-stmzm" event={"ID":"152c2fa0-6d86-426a-bbda-00226a6a9cc0","Type":"ContainerStarted","Data":"53975a01000f1ca79bb5e99659844e109220d94dcb650f91df8d6031fbdd8fc5"} Apr 23 08:12:53.128862 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.128827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c7k5b" event={"ID":"9fb79acc-affd-4077-a7a0-e13654094fad","Type":"ContainerStarted","Data":"0df88126d76f59fb9fcf766ca922f957d909f241b1482202e4ef06a6c85cb4be"} Apr 23 08:12:53.130142 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.130120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerStarted","Data":"f51d2c4f9cbec7b0650333163cc9f4ab81e05c9101bfca91156bece9b084edcc"} Apr 23 08:12:53.131822 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.131802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"497c882647758f9b1eab7305f7e08935d0e11b78ff24a35f8d0f4c21f49fe2c8"} Apr 23 08:12:53.131895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.131828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"478eef662ded70a0d1c9a0cfae2c085e37f03c29c2bb2f4a7dec51188818a29d"} Apr 23 08:12:53.131895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.131841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" 
event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"ba3029024e656688201c23ddefa926cea3c90923a8ee7ffcf5a7b0a8494ccf76"} Apr 23 08:12:53.133066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.133045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rkg5x" event={"ID":"08aba02d-ea63-43f8-9e3d-409a65aa759d","Type":"ContainerStarted","Data":"0648bd4200f3813641776b54921a9b1448586ff0826ca489f99cd18204d52930"} Apr 23 08:12:53.134320 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.134301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" event={"ID":"522a528b-85f9-4e19-a62b-b53d7868c26e","Type":"ContainerStarted","Data":"16387301234cb2699ce64a4f4be731a9fa8e2032873ab16c769bdb4e81f81a41"} Apr 23 08:12:53.135899 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.135578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" event={"ID":"54fc82b6-757a-4606-81ce-75c113d9a233","Type":"ContainerStarted","Data":"ef492d4d76d2d6308347ef412797e36c24dd53c18e2971312e658219caac5891"} Apr 23 08:12:53.137555 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.137509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgplt" event={"ID":"18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1","Type":"ContainerStarted","Data":"f2ee779242f17befe38be665838898f088f7a0a1d59ece45356e2e7d058e8398"} Apr 23 08:12:53.161030 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.160980 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-stmzm" podStartSLOduration=3.03798008 podStartE2EDuration="20.160964969s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.580771276 +0000 UTC m=+3.225555901" lastFinishedPulling="2026-04-23 08:12:52.703756164 +0000 UTC m=+20.348540790" 
observedRunningTime="2026-04-23 08:12:53.160822634 +0000 UTC m=+20.805607279" watchObservedRunningTime="2026-04-23 08:12:53.160964969 +0000 UTC m=+20.805749615" Apr 23 08:12:53.178652 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.178605 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sgplt" podStartSLOduration=2.785962686 podStartE2EDuration="20.178592257s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.552387272 +0000 UTC m=+3.197171910" lastFinishedPulling="2026-04-23 08:12:52.94501685 +0000 UTC m=+20.589801481" observedRunningTime="2026-04-23 08:12:53.178150637 +0000 UTC m=+20.822935279" watchObservedRunningTime="2026-04-23 08:12:53.178592257 +0000 UTC m=+20.823376900" Apr 23 08:12:53.192236 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.192201 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5xcj5" podStartSLOduration=3.02809373 podStartE2EDuration="20.192188569s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.556834866 +0000 UTC m=+3.201619488" lastFinishedPulling="2026-04-23 08:12:52.720929689 +0000 UTC m=+20.365714327" observedRunningTime="2026-04-23 08:12:53.192061908 +0000 UTC m=+20.836846551" watchObservedRunningTime="2026-04-23 08:12:53.192188569 +0000 UTC m=+20.836973212" Apr 23 08:12:53.234424 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.234389 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rkg5x" podStartSLOduration=3.0963394969999998 podStartE2EDuration="20.234375308s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.580800933 +0000 UTC m=+3.225585559" lastFinishedPulling="2026-04-23 08:12:52.718836736 +0000 UTC m=+20.363621370" observedRunningTime="2026-04-23 08:12:53.234199973 +0000 UTC m=+20.878984616" 
watchObservedRunningTime="2026-04-23 08:12:53.234375308 +0000 UTC m=+20.879159951" Apr 23 08:12:53.255167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.255120 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c7k5b" podStartSLOduration=7.893899372 podStartE2EDuration="20.255104787s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.555604669 +0000 UTC m=+3.200389290" lastFinishedPulling="2026-04-23 08:12:47.916810084 +0000 UTC m=+15.561594705" observedRunningTime="2026-04-23 08:12:53.254831788 +0000 UTC m=+20.899616432" watchObservedRunningTime="2026-04-23 08:12:53.255104787 +0000 UTC m=+20.899889432" Apr 23 08:12:53.428230 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:53.428197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:53.428374 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:53.428336 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:53.428431 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:53.428420 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:12:57.428383184 +0000 UTC m=+25.073167807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:54.044151 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.044129 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:12:54.141544 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.141510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"46ead6ac18ddbf157fe65b9b069a158c2b151cd839e8212046aa4dfb8e6b0be3"} Apr 23 08:12:54.141544 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.141547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"9d681345558241a8cd7b842541e8c5fdbe0488bca2cf200da0bd1a060e9a41c7"} Apr 23 08:12:54.141728 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.141557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"602a2a483e33dc320979bcbd5c3012a5c07626fa2cc5af04bf70f05dca52f95e"} Apr 23 08:12:54.142941 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.142894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" event={"ID":"522a528b-85f9-4e19-a62b-b53d7868c26e","Type":"ContainerStarted","Data":"ee6900118f12a8bf9e683ba5673ab6bc7f8d284989e7c064a0ac69a02f90c27e"} Apr 23 08:12:54.144050 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:12:54.144025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ncw6m" event={"ID":"724188f0-71e3-4a41-97c0-51d0f88e6c75","Type":"ContainerStarted","Data":"57a5ea2e4edaf54e6577ca073835f2b90a2fdd3b1ec9e61ea95e8a7e0e001a3d"} Apr 23 08:12:54.145297 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.145274 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="f51d2c4f9cbec7b0650333163cc9f4ab81e05c9101bfca91156bece9b084edcc" exitCode=0 Apr 23 08:12:54.145405 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.145377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"f51d2c4f9cbec7b0650333163cc9f4ab81e05c9101bfca91156bece9b084edcc"} Apr 23 08:12:54.158881 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.158849 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ncw6m" podStartSLOduration=3.989830211 podStartE2EDuration="21.158838773s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.549846577 +0000 UTC m=+3.194631202" lastFinishedPulling="2026-04-23 08:12:52.718855134 +0000 UTC m=+20.363639764" observedRunningTime="2026-04-23 08:12:54.158785159 +0000 UTC m=+21.803569804" watchObservedRunningTime="2026-04-23 08:12:54.158838773 +0000 UTC m=+21.803623418" Apr 23 08:12:54.944771 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.944508 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:12:54.044146281Z","UUID":"2403f134-c750-419f-9093-27989c991edb","Handler":null,"Name":"","Endpoint":""} Apr 23 08:12:54.947532 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.947507 2576 
csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:12:54.947532 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:54.947535 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:12:55.001770 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:55.001739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:55.001964 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:55.001783 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:55.001964 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:55.001745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:55.001964 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:55.001865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07" Apr 23 08:12:55.002131 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:55.001981 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9" Apr 23 08:12:55.002131 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:55.002074 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:55.149224 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:55.149149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" event={"ID":"522a528b-85f9-4e19-a62b-b53d7868c26e","Type":"ContainerStarted","Data":"671af438aa9db2e378ee87d76ea813dd8aa37b12444f7bcebcd4a892eb82cfdc"} Apr 23 08:12:55.165389 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:55.165344 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fqwjw" podStartSLOduration=2.959847798 podStartE2EDuration="22.165330757s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.557276789 +0000 UTC m=+3.202061420" lastFinishedPulling="2026-04-23 08:12:54.762759755 +0000 UTC m=+22.407544379" observedRunningTime="2026-04-23 08:12:55.165228392 +0000 UTC m=+22.810013057" watchObservedRunningTime="2026-04-23 08:12:55.165330757 +0000 UTC m=+22.810115398" Apr 23 08:12:56.153625 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:56.153589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"d007ed5ac41e8cfd64f4308fac76ac1257bdad28ad3f82920ebf1f0c3bbcc36b"} Apr 23 08:12:57.002595 ip-10-0-139-180 kubenswrapper[2576]: 
I0423 08:12:57.002550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:57.002792 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:57.002651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:57.002792 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:57.002674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:57.002934 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:57.002783 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07" Apr 23 08:12:57.003128 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:57.003084 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9" Apr 23 08:12:57.003240 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:57.003194 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:57.455135 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:57.455055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:57.455645 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:57.455159 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:57.455645 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:57.455207 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:05.455194938 +0000 UTC m=+33.099979560 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:12:57.963073 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:57.962806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:57.963517 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:57.963490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:58.160928 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:58.160869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" event={"ID":"2c90cc59-7d96-4997-8dd0-3c2ac01d264d","Type":"ContainerStarted","Data":"af02b1b5ee1bd1435b9321450656cbfa50a981b0cdbc583098a67a355d6c5443"} Apr 23 08:12:58.161303 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:58.161217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:58.161663 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:58.161638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c7k5b" Apr 23 08:12:58.185440 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:58.185395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" podStartSLOduration=8.007524682 podStartE2EDuration="25.185377697s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.580742376 +0000 UTC m=+3.225526998" lastFinishedPulling="2026-04-23 08:12:52.758595377 +0000 UTC m=+20.403380013" observedRunningTime="2026-04-23 08:12:58.184279203 +0000 UTC 
m=+25.829063887" watchObservedRunningTime="2026-04-23 08:12:58.185377697 +0000 UTC m=+25.830162342" Apr 23 08:12:59.001841 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.001808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:59.002395 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.001808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:59.002395 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.001944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07" Apr 23 08:12:59.002395 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.002044 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:59.002395 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.001811 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:12:59.002395 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.002169 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9" Apr 23 08:12:59.163633 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.163293 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 08:12:59.164029 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.164008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:59.164165 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.164043 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:59.182887 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.182858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:59.183087 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.182984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg" Apr 23 08:12:59.457390 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.457359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vdn4h"] Apr 23 08:12:59.457582 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.457463 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:12:59.457632 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.457596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07" Apr 23 08:12:59.460076 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.460044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fflkd"] Apr 23 08:12:59.460212 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.460136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:12:59.460269 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.460242 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f" Apr 23 08:12:59.472565 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.472547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jb8d5"] Apr 23 08:12:59.472651 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:12:59.472611 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:12:59.472705 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:12:59.472686 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:13:00.166081 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:00.166050 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="1e76fe81383b72b89bc200138b244a9783b16ea434e94f652bae1b224879bb77" exitCode=0
Apr 23 08:13:00.166456 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:00.166139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"1e76fe81383b72b89bc200138b244a9783b16ea434e94f652bae1b224879bb77"}
Apr 23 08:13:00.166456 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:00.166280 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:13:01.001813 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:01.001783 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:13:01.001813 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:01.001793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:01.002049 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:01.001781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:13:01.002049 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:01.001919 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:13:01.002049 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:01.002006 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:13:01.002151 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:01.002110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:13:01.168110 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:01.168084 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:13:02.172368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:02.172337 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="b1968ac6423f11bfbbdf2246134852aec3b2f8a19415189ea2eb69ef05ab4c68" exitCode=0
Apr 23 08:13:02.172863 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:02.172407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"b1968ac6423f11bfbbdf2246134852aec3b2f8a19415189ea2eb69ef05ab4c68"}
Apr 23 08:13:03.003230 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:03.003200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:03.003369 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:03.003281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:13:03.003369 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:03.003317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:13:03.003369 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:03.003333 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:13:03.003464 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:03.003402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:13:03.003517 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:03.003499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:13:03.175885 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:03.175812 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="db5a857fed470a0053767985fb25f9b1b9bdc94218a7f52b0cb8d932a85afd20" exitCode=0
Apr 23 08:13:03.176335 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:03.175895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"db5a857fed470a0053767985fb25f9b1b9bdc94218a7f52b0cb8d932a85afd20"}
Apr 23 08:13:04.336445 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:04.336414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:13:04.337009 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:04.336624 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:13:04.348150 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:04.348126 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgndg"
Apr 23 08:13:05.001976 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.001940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:05.001976 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.001956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:13:05.002220 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.002067 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vdn4h" podUID="d194f1f2-2b7e-468a-86d7-142892eaac07"
Apr 23 08:13:05.002220 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.002123 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jb8d5" podUID="9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9"
Apr 23 08:13:05.002220 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.002159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:13:05.002375 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.002277 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fflkd" podUID="df2ff433-01c3-442f-b962-0dbfe4dd622f"
Apr 23 08:13:05.523860 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.523826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:05.524286 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.523999 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:13:05.524286 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.524069 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret podName:d194f1f2-2b7e-468a-86d7-142892eaac07 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:21.524051991 +0000 UTC m=+49.168836612 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret") pod "global-pull-secret-syncer-vdn4h" (UID: "d194f1f2-2b7e-468a-86d7-142892eaac07") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:13:05.654644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.654528 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-180.ec2.internal" event="NodeReady"
Apr 23 08:13:05.654808 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.654682 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:13:05.699660 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.699628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g69k6"]
Apr 23 08:13:05.710056 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.710023 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p8pm2"]
Apr 23 08:13:05.710215 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.710193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.712853 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.712823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:13:05.712986 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.712872 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:13:05.713164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.713140 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\""
Apr 23 08:13:05.720755 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.720735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8pm2"]
Apr 23 08:13:05.720755 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.720756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g69k6"]
Apr 23 08:13:05.720935 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.720842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:05.723296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.723270 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\""
Apr 23 08:13:05.724564 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.723682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:13:05.724564 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.724213 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:13:05.724725 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.724702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:13:05.826679 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-tmp-dir\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.826873 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrsq\" (UniqueName: \"kubernetes.io/projected/6d10c1ba-7b11-4a83-938c-04443e1047c2-kube-api-access-7zrsq\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:05.826873 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.826873 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xf6\" (UniqueName: \"kubernetes.io/projected/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-kube-api-access-66xf6\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.826873 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:05.826873 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.826869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-config-volume\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.927706 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66xf6\" (UniqueName: \"kubernetes.io/projected/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-kube-api-access-66xf6\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.927706 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:05.928085 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-config-volume\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.928085 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-tmp-dir\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.928085 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrsq\" (UniqueName: \"kubernetes.io/projected/6d10c1ba-7b11-4a83-938c-04443e1047c2-kube-api-access-7zrsq\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:05.928085 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.927902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.928085 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.928033 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:05.928364 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.928092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:06.428072676 +0000 UTC m=+34.072857305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:05.928364 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.928263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-tmp-dir\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.928473 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.928376 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:05.928473 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:05.928438 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:06.42842083 +0000 UTC m=+34.073205467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:13:05.928473 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.928377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-config-volume\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.938373 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.938225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xf6\" (UniqueName: \"kubernetes.io/projected/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-kube-api-access-66xf6\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:05.938486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:05.938284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrsq\" (UniqueName: \"kubernetes.io/projected/6d10c1ba-7b11-4a83-938c-04443e1047c2-kube-api-access-7zrsq\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:06.430990 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:06.430959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:06.431171 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:06.431006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:06.431171 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.431122 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:06.431171 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.431130 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:06.431322 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.431175 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:07.431160868 +0000 UTC m=+35.075945502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:13:06.431322 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.431187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:07.431181707 +0000 UTC m=+35.075966329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:06.633153 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:06.633116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:13:06.633560 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.633299 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:13:06.633560 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.633322 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:13:06.633560 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.633335 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c8gxq for pod openshift-network-diagnostics/network-check-target-jb8d5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:13:06.633560 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.633403 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq podName:9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:38.633384448 +0000 UTC m=+66.278169079 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8gxq" (UniqueName: "kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq") pod "network-check-target-jb8d5" (UID: "9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:13:06.734560 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:06.734472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:13:06.734713 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.734642 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:13:06.734774 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:06.734717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:13:38.734697337 +0000 UTC m=+66.379481965 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:13:07.001931 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.001838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:07.001931 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.001864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd"
Apr 23 08:13:07.002125 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.001838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:13:07.004319 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.004292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 08:13:07.005268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.005117 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 08:13:07.005268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.005146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\""
Apr 23 08:13:07.005268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.005185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:13:07.005268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.005117 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\""
Apr 23 08:13:07.005268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.005194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:13:07.440264 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.440190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:07.440435 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:07.440277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:07.440435 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:07.440339 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:07.440435 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:07.440374 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:07.440435 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:07.440412 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:09.440397433 +0000 UTC m=+37.085182056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:13:07.440435 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:07.440430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:09.440423625 +0000 UTC m=+37.085208246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:09.190640 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:09.190610 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="9bfb2ff76fc7f1310e9913ab88e690065cb8d213ccd241db7fc85699f9854baf" exitCode=0
Apr 23 08:13:09.191121 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:09.190666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"9bfb2ff76fc7f1310e9913ab88e690065cb8d213ccd241db7fc85699f9854baf"}
Apr 23 08:13:09.455461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:09.455423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:09.455603 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:09.455485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:09.455603 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:09.455569 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:09.455603 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:09.455589 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:09.455693 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:09.455634 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:13.455619407 +0000 UTC m=+41.100404029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:09.455693 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:09.455649 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:13.455643476 +0000 UTC m=+41.100428098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:13:10.195417 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:10.195378 2576 generic.go:358] "Generic (PLEG): container finished" podID="42d0b805-f001-437e-a62c-21317c5168f5" containerID="a210660a48268b1e675bcdd91525eda5ea20687270bf8b9a24458fe6b6aa4210" exitCode=0
Apr 23 08:13:10.195417 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:10.195417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerDied","Data":"a210660a48268b1e675bcdd91525eda5ea20687270bf8b9a24458fe6b6aa4210"}
Apr 23 08:13:11.200444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:11.200406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g7chr" event={"ID":"42d0b805-f001-437e-a62c-21317c5168f5","Type":"ContainerStarted","Data":"1443dc52df32aa6a1f6ebee4b8d2ee20d38afe4029fbe87d163a531de6cdea48"}
Apr 23 08:13:11.221723 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:11.221676 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g7chr" podStartSLOduration=5.066246969 podStartE2EDuration="38.22166229s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:12:35.554474251 +0000 UTC m=+3.199258875" lastFinishedPulling="2026-04-23 08:13:08.70988957 +0000 UTC m=+36.354674196" observedRunningTime="2026-04-23 08:13:11.220846592 +0000 UTC m=+38.865631237" watchObservedRunningTime="2026-04-23 08:13:11.22166229 +0000 UTC m=+38.866446933"
Apr 23 08:13:13.482412 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:13.482375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:13.482805 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:13.482429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:13.482805 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:13.482529 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:13.482805 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:13.482599 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:21.482584546 +0000 UTC m=+49.127369172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:13.482805 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:13.482531 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:13.482805 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:13.482656 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:21.482644653 +0000 UTC m=+49.127429292 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:13:21.539974 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.539934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h"
Apr 23 08:13:21.540489 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.539986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:13:21.540489 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.540025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:13:21.540489 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:21.540132 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:21.540489 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:21.540141 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:21.540489 ip-10-0-139-180
kubenswrapper[2576]: E0423 08:13:21.540212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:37.540192368 +0000 UTC m=+65.184976997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found Apr 23 08:13:21.540489 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:21.540226 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:37.540219977 +0000 UTC m=+65.185004598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found Apr 23 08:13:21.543478 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.543450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d194f1f2-2b7e-468a-86d7-142892eaac07-original-pull-secret\") pod \"global-pull-secret-syncer-vdn4h\" (UID: \"d194f1f2-2b7e-468a-86d7-142892eaac07\") " pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:13:21.712603 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.712575 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdn4h" Apr 23 08:13:21.862038 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:21.862008 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vdn4h"] Apr 23 08:13:21.866175 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:13:21.866146 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd194f1f2_2b7e_468a_86d7_142892eaac07.slice/crio-e571d97e4b57bff23639d5dd9d0cb358014a91b4a47fa5bf03f2ff1e29aea097 WatchSource:0}: Error finding container e571d97e4b57bff23639d5dd9d0cb358014a91b4a47fa5bf03f2ff1e29aea097: Status 404 returned error can't find the container with id e571d97e4b57bff23639d5dd9d0cb358014a91b4a47fa5bf03f2ff1e29aea097 Apr 23 08:13:22.222103 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:22.222067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vdn4h" event={"ID":"d194f1f2-2b7e-468a-86d7-142892eaac07","Type":"ContainerStarted","Data":"e571d97e4b57bff23639d5dd9d0cb358014a91b4a47fa5bf03f2ff1e29aea097"} Apr 23 08:13:26.230870 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:26.230778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vdn4h" event={"ID":"d194f1f2-2b7e-468a-86d7-142892eaac07","Type":"ContainerStarted","Data":"07c94b957fc5ab958a6bcd9c396270545ad06a6449631c6ead04544274442641"} Apr 23 08:13:26.245605 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:26.245556 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vdn4h" podStartSLOduration=33.183088872 podStartE2EDuration="37.245543226s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:13:21.867757782 +0000 UTC m=+49.512542416" lastFinishedPulling="2026-04-23 08:13:25.930212148 +0000 UTC m=+53.574996770" 
observedRunningTime="2026-04-23 08:13:26.244416967 +0000 UTC m=+53.889201624" watchObservedRunningTime="2026-04-23 08:13:26.245543226 +0000 UTC m=+53.890327869" Apr 23 08:13:27.186407 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.186372 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd"] Apr 23 08:13:27.188232 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.188216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.190393 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.190372 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 08:13:27.190393 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.190385 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 08:13:27.190616 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.190604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 08:13:27.191319 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.191300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 08:13:27.195847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.195827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd"] Apr 23 08:13:27.281770 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.281739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" 
(UniqueName: \"kubernetes.io/secret/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.281770 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.281772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ww5\" (UniqueName: \"kubernetes.io/projected/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-kube-api-access-l4ww5\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.282184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.281864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-tmp\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.382548 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.382515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-tmp\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.382659 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.382589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" 
(UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.382659 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.382610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ww5\" (UniqueName: \"kubernetes.io/projected/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-kube-api-access-l4ww5\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.382929 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.382890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-tmp\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.384961 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.384933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.392641 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.392621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ww5\" (UniqueName: \"kubernetes.io/projected/fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8-kube-api-access-l4ww5\") pod \"klusterlet-addon-workmgr-5fbbfc68f9-65bvd\" (UID: \"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.498037 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:13:27.497972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:27.621374 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:27.621341 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd"] Apr 23 08:13:27.624923 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:13:27.624888 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa889ff4_c7ea_4c4f_ac16_e3bcf18a05a8.slice/crio-7bd821a7f2cca221d243e927c15b3dccad7d63733a7ff224999e96fbb9b9c2c1 WatchSource:0}: Error finding container 7bd821a7f2cca221d243e927c15b3dccad7d63733a7ff224999e96fbb9b9c2c1: Status 404 returned error can't find the container with id 7bd821a7f2cca221d243e927c15b3dccad7d63733a7ff224999e96fbb9b9c2c1 Apr 23 08:13:28.236523 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:28.236487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" event={"ID":"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8","Type":"ContainerStarted","Data":"7bd821a7f2cca221d243e927c15b3dccad7d63733a7ff224999e96fbb9b9c2c1"} Apr 23 08:13:32.245640 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:32.245594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" event={"ID":"fa889ff4-c7ea-4c4f-ac16-e3bcf18a05a8","Type":"ContainerStarted","Data":"b16e760caca9c4f4d26785aa8e74a1bb427342d8faffb432a0c1166c6029e391"} Apr 23 08:13:32.246096 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:32.245806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:32.247526 
ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:32.247503 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" Apr 23 08:13:32.261260 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:32.261216 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fbbfc68f9-65bvd" podStartSLOduration=1.33547709 podStartE2EDuration="5.261204838s" podCreationTimestamp="2026-04-23 08:13:27 +0000 UTC" firstStartedPulling="2026-04-23 08:13:27.626610725 +0000 UTC m=+55.271395347" lastFinishedPulling="2026-04-23 08:13:31.552338474 +0000 UTC m=+59.197123095" observedRunningTime="2026-04-23 08:13:32.260451179 +0000 UTC m=+59.905235824" watchObservedRunningTime="2026-04-23 08:13:32.261204838 +0000 UTC m=+59.905989482" Apr 23 08:13:37.557472 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:37.557436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2" Apr 23 08:13:37.557831 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:37.557532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6" Apr 23 08:13:37.557831 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:37.557589 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:37.557831 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:37.557610 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:37.557831 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:37.557673 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:09.557657916 +0000 UTC m=+97.202442553 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found Apr 23 08:13:37.557831 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:37.557691 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:09.557684147 +0000 UTC m=+97.202468769 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found Apr 23 08:13:38.665708 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.665663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:13:38.668309 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.668291 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:13:38.678158 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.678137 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:13:38.689160 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.689130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8gxq\" (UniqueName: \"kubernetes.io/projected/9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9-kube-api-access-c8gxq\") pod \"network-check-target-jb8d5\" (UID: \"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9\") " pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:13:38.766589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.766555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 
08:13:38.768972 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.768950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:13:38.776890 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:38.776874 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:13:38.776988 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:13:38.776964 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs podName:df2ff433-01c3-442f-b962-0dbfe4dd622f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:42.776942697 +0000 UTC m=+130.421727336 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs") pod "network-metrics-daemon-fflkd" (UID: "df2ff433-01c3-442f-b962-0dbfe4dd622f") : secret "metrics-daemon-secret" not found Apr 23 08:13:38.822505 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.822482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\"" Apr 23 08:13:38.830371 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.830350 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:13:38.937509 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:38.937452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jb8d5"] Apr 23 08:13:38.941168 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:13:38.941143 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e38d3e1_2d82_4dcf_a3b0_58b8c0571ff9.slice/crio-20e66d1860cf9c562ac352e6a8dee5bd45349b196234c5a4c99cdd98bb2e160c WatchSource:0}: Error finding container 20e66d1860cf9c562ac352e6a8dee5bd45349b196234c5a4c99cdd98bb2e160c: Status 404 returned error can't find the container with id 20e66d1860cf9c562ac352e6a8dee5bd45349b196234c5a4c99cdd98bb2e160c Apr 23 08:13:39.262955 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:39.262868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jb8d5" event={"ID":"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9","Type":"ContainerStarted","Data":"20e66d1860cf9c562ac352e6a8dee5bd45349b196234c5a4c99cdd98bb2e160c"} Apr 23 08:13:42.269455 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:42.269424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jb8d5" event={"ID":"9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9","Type":"ContainerStarted","Data":"c620405890f6462ae3d8b6808eac08d02450f05aecf08208656645ca72919e0c"} Apr 23 08:13:42.269765 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:42.269550 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jb8d5" Apr 23 08:13:42.285322 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:42.285279 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jb8d5" 
podStartSLOduration=66.039618787 podStartE2EDuration="1m9.285266111s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:13:38.943049728 +0000 UTC m=+66.587834353" lastFinishedPulling="2026-04-23 08:13:42.188697054 +0000 UTC m=+69.833481677" observedRunningTime="2026-04-23 08:13:42.284431144 +0000 UTC m=+69.929215792" watchObservedRunningTime="2026-04-23 08:13:42.285266111 +0000 UTC m=+69.930050755" Apr 23 08:13:59.514043 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.513889 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n"] Apr 23 08:13:59.518938 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.518901 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dwbj6"] Apr 23 08:13:59.519058 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.519017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" Apr 23 08:13:59.521229 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.521205 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.521229 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.521218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-dx7kd\"" Apr 23 08:13:59.522016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.522001 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.522061 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.522051 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.525610 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.525591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 08:13:59.525831 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.525815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4874m\"" Apr 23 08:13:59.525925 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.525825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:13:59.527839 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.527822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 08:13:59.527944 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.527923 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 08:13:59.530728 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.530710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n"] Apr 23 08:13:59.536831 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.536813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 08:13:59.547565 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.547547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-dwbj6"] Apr 23 08:13:59.611260 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.611230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-config\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.611380 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.611278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c078d65-5482-4cf4-96a9-20d4ce24cf24-serving-cert\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.611380 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.611316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-trusted-ca\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.611380 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.611332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdj2\" (UniqueName: \"kubernetes.io/projected/b0e306ba-d423-4c00-810b-cb7950b66fb6-kube-api-access-4tdj2\") pod \"volume-data-source-validator-7c6cbb6c87-4528n\" (UID: \"b0e306ba-d423-4c00-810b-cb7950b66fb6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" Apr 23 08:13:59.611380 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.611361 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jfm\" (UniqueName: \"kubernetes.io/projected/5c078d65-5482-4cf4-96a9-20d4ce24cf24-kube-api-access-d6jfm\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.712389 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.712333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-trusted-ca\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.712389 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.712395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdj2\" (UniqueName: \"kubernetes.io/projected/b0e306ba-d423-4c00-810b-cb7950b66fb6-kube-api-access-4tdj2\") pod \"volume-data-source-validator-7c6cbb6c87-4528n\" (UID: \"b0e306ba-d423-4c00-810b-cb7950b66fb6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" Apr 23 08:13:59.712614 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.712452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jfm\" (UniqueName: \"kubernetes.io/projected/5c078d65-5482-4cf4-96a9-20d4ce24cf24-kube-api-access-d6jfm\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.712614 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.712488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-config\") pod 
\"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.712706 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.712607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c078d65-5482-4cf4-96a9-20d4ce24cf24-serving-cert\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.713267 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.713242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-trusted-ca\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.713757 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.713738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c078d65-5482-4cf4-96a9-20d4ce24cf24-config\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.714963 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.714936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c078d65-5482-4cf4-96a9-20d4ce24cf24-serving-cert\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.720155 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.720128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4tdj2\" (UniqueName: \"kubernetes.io/projected/b0e306ba-d423-4c00-810b-cb7950b66fb6-kube-api-access-4tdj2\") pod \"volume-data-source-validator-7c6cbb6c87-4528n\" (UID: \"b0e306ba-d423-4c00-810b-cb7950b66fb6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" Apr 23 08:13:59.720254 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.720136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jfm\" (UniqueName: \"kubernetes.io/projected/5c078d65-5482-4cf4-96a9-20d4ce24cf24-kube-api-access-d6jfm\") pod \"console-operator-9d4b6777b-dwbj6\" (UID: \"5c078d65-5482-4cf4-96a9-20d4ce24cf24\") " pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.829250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.829154 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" Apr 23 08:13:59.834979 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.834954 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" Apr 23 08:13:59.949463 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.949428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n"] Apr 23 08:13:59.952480 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:13:59.952452 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e306ba_d423_4c00_810b_cb7950b66fb6.slice/crio-9360a3bc88c868290e2aab7d5bc2a069161c83ae45291155c7fcb82f0fa45238 WatchSource:0}: Error finding container 9360a3bc88c868290e2aab7d5bc2a069161c83ae45291155c7fcb82f0fa45238: Status 404 returned error can't find the container with id 9360a3bc88c868290e2aab7d5bc2a069161c83ae45291155c7fcb82f0fa45238 Apr 23 08:13:59.964566 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.964542 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dwbj6"] Apr 23 08:13:59.967664 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:13:59.967640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c078d65_5482_4cf4_96a9_20d4ce24cf24.slice/crio-8595e644ada28c7cd9e9ac5d0a960bfe369fdfc5485dbe37289ac4751956794d WatchSource:0}: Error finding container 8595e644ada28c7cd9e9ac5d0a960bfe369fdfc5485dbe37289ac4751956794d: Status 404 returned error can't find the container with id 8595e644ada28c7cd9e9ac5d0a960bfe369fdfc5485dbe37289ac4751956794d Apr 23 08:13:59.987572 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.987551 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"] Apr 23 08:13:59.992179 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.992164 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:13:59.994442 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.994421 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 08:13:59.994735 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.994718 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 08:13:59.994827 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.994812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 08:13:59.995155 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.995140 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-24qw2\"" Apr 23 08:13:59.999741 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.999442 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 08:13:59.999817 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:13:59.999796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"] Apr 23 08:14:00.115292 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrqj\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115292 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115258 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.115512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.115491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216607 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216607 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " 
pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216792 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216792 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrqj\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216792 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.216889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.216872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.217071 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.217013 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:00.217071 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.217030 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found Apr 23 08:14:00.217166 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.217094 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:00.717070823 +0000 UTC m=+88.361855462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found Apr 23 08:14:00.217256 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.217236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.217446 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.217424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.217733 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.217718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.219257 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.219241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 
08:14:00.219303 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.219280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.225302 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.225284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.225368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.225348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrqj\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.302867 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.302831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" event={"ID":"b0e306ba-d423-4c00-810b-cb7950b66fb6","Type":"ContainerStarted","Data":"9360a3bc88c868290e2aab7d5bc2a069161c83ae45291155c7fcb82f0fa45238"} Apr 23 08:14:00.303765 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.303740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" 
event={"ID":"5c078d65-5482-4cf4-96a9-20d4ce24cf24","Type":"ContainerStarted","Data":"8595e644ada28c7cd9e9ac5d0a960bfe369fdfc5485dbe37289ac4751956794d"} Apr 23 08:14:00.721604 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:00.721558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:00.722061 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.721699 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:00.722061 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.721724 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found Apr 23 08:14:00.722061 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:00.721802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:01.721781047 +0000 UTC m=+89.366565671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found Apr 23 08:14:01.731701 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:01.731663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:01.732147 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:01.731804 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:01.732147 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:01.731821 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found Apr 23 08:14:01.732147 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:01.731887 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:03.731866274 +0000 UTC m=+91.376650901 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found Apr 23 08:14:02.309930 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.309892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" event={"ID":"b0e306ba-d423-4c00-810b-cb7950b66fb6","Type":"ContainerStarted","Data":"9b34e6040dd0cabfd93df3d3e953fc85112b6a416b5c960410c89c5d4045bf3f"} Apr 23 08:14:02.327842 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.327801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4528n" podStartSLOduration=1.918520826 podStartE2EDuration="3.327788561s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="2026-04-23 08:13:59.953994196 +0000 UTC m=+87.598778819" lastFinishedPulling="2026-04-23 08:14:01.363261918 +0000 UTC m=+89.008046554" observedRunningTime="2026-04-23 08:14:02.327358715 +0000 UTC m=+89.972143371" watchObservedRunningTime="2026-04-23 08:14:02.327788561 +0000 UTC m=+89.972573204" Apr 23 08:14:02.435980 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.433925 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w"] Apr 23 08:14:02.437309 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.437288 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.439700 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.439680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-4j2ht\"" Apr 23 08:14:02.439840 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.439741 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 08:14:02.439956 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.439936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:14:02.440001 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.439968 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 08:14:02.440001 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.439976 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 08:14:02.445454 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.445428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w"] Apr 23 08:14:02.538669 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.538619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw98d\" (UniqueName: \"kubernetes.io/projected/e4d55644-7fda-4d49-b10d-7977a14862de-kube-api-access-pw98d\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: 
\"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.538840 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.538695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4d55644-7fda-4d49-b10d-7977a14862de-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.538840 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.538718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4d55644-7fda-4d49-b10d-7977a14862de-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.640014 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.639979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4d55644-7fda-4d49-b10d-7977a14862de-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.640014 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.640012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4d55644-7fda-4d49-b10d-7977a14862de-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: 
\"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.640304 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.640044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw98d\" (UniqueName: \"kubernetes.io/projected/e4d55644-7fda-4d49-b10d-7977a14862de-kube-api-access-pw98d\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.640622 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.640600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4d55644-7fda-4d49-b10d-7977a14862de-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.642175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.642157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4d55644-7fda-4d49-b10d-7977a14862de-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: \"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.648664 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.648638 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw98d\" (UniqueName: \"kubernetes.io/projected/e4d55644-7fda-4d49-b10d-7977a14862de-kube-api-access-pw98d\") pod \"kube-storage-version-migrator-operator-6769c5d45-g944w\" (UID: 
\"e4d55644-7fda-4d49-b10d-7977a14862de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.764023 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.763999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" Apr 23 08:14:02.871984 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:02.871954 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w"] Apr 23 08:14:02.875808 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:02.875784 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d55644_7fda_4d49_b10d_7977a14862de.slice/crio-2c76434f7d340ff14cab263cd0329f1022b709824884b4f4bba3d284ddc9fdb0 WatchSource:0}: Error finding container 2c76434f7d340ff14cab263cd0329f1022b709824884b4f4bba3d284ddc9fdb0: Status 404 returned error can't find the container with id 2c76434f7d340ff14cab263cd0329f1022b709824884b4f4bba3d284ddc9fdb0 Apr 23 08:14:03.313599 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.313573 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/0.log" Apr 23 08:14:03.313756 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.313611 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c078d65-5482-4cf4-96a9-20d4ce24cf24" containerID="f3ac84345a36b19ded98cea900a4295392a6c0278ff074a6a7e1dfb0c79f9a4d" exitCode=255 Apr 23 08:14:03.313756 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.313643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" 
event={"ID":"5c078d65-5482-4cf4-96a9-20d4ce24cf24","Type":"ContainerDied","Data":"f3ac84345a36b19ded98cea900a4295392a6c0278ff074a6a7e1dfb0c79f9a4d"} Apr 23 08:14:03.313972 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.313957 2576 scope.go:117] "RemoveContainer" containerID="f3ac84345a36b19ded98cea900a4295392a6c0278ff074a6a7e1dfb0c79f9a4d" Apr 23 08:14:03.314683 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.314659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" event={"ID":"e4d55644-7fda-4d49-b10d-7977a14862de","Type":"ContainerStarted","Data":"2c76434f7d340ff14cab263cd0329f1022b709824884b4f4bba3d284ddc9fdb0"} Apr 23 08:14:03.749570 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:03.749541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:03.749756 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:03.749692 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:03.749756 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:03.749711 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found Apr 23 08:14:03.749874 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:03.749774 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:07.74975453 +0000 UTC m=+95.394539175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found
Apr 23 08:14:04.318623 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.318595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/1.log"
Apr 23 08:14:04.319081 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.319004 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/0.log"
Apr 23 08:14:04.319081 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.319036 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c078d65-5482-4cf4-96a9-20d4ce24cf24" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d" exitCode=255
Apr 23 08:14:04.319177 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.319102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" event={"ID":"5c078d65-5482-4cf4-96a9-20d4ce24cf24","Type":"ContainerDied","Data":"f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"}
Apr 23 08:14:04.319177 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.319146 2576 scope.go:117] "RemoveContainer" containerID="f3ac84345a36b19ded98cea900a4295392a6c0278ff074a6a7e1dfb0c79f9a4d"
Apr 23 08:14:04.319378 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.319365 2576 scope.go:117] "RemoveContainer" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"
Apr 23 08:14:04.319575 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:04.319554 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dwbj6_openshift-console-operator(5c078d65-5482-4cf4-96a9-20d4ce24cf24)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podUID="5c078d65-5482-4cf4-96a9-20d4ce24cf24"
Apr 23 08:14:04.461774 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.461743 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"]
Apr 23 08:14:04.487224 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.487198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"]
Apr 23 08:14:04.487361 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.487294 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.489761 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.489742 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 08:14:04.489871 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.489780 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 08:14:04.489979 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.489966 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 08:14:04.490205 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.490192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hd9g4\""
Apr 23 08:14:04.490246 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.490222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:14:04.556929 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.556880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fba226-f299-4daf-9b75-93ade820fb8b-config\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.557096 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.556952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdpx\" (UniqueName: \"kubernetes.io/projected/32fba226-f299-4daf-9b75-93ade820fb8b-kube-api-access-qmdpx\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.557096 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.557023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fba226-f299-4daf-9b75-93ade820fb8b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.657694 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.657665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fba226-f299-4daf-9b75-93ade820fb8b-config\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.657837 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.657700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdpx\" (UniqueName: \"kubernetes.io/projected/32fba226-f299-4daf-9b75-93ade820fb8b-kube-api-access-qmdpx\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.657837 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.657750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fba226-f299-4daf-9b75-93ade820fb8b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.658175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.658156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fba226-f299-4daf-9b75-93ade820fb8b-config\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.659891 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.659873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fba226-f299-4daf-9b75-93ade820fb8b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.667502 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.667482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdpx\" (UniqueName: \"kubernetes.io/projected/32fba226-f299-4daf-9b75-93ade820fb8b-kube-api-access-qmdpx\") pod \"service-ca-operator-d6fc45fc5-v67lr\" (UID: \"32fba226-f299-4daf-9b75-93ade820fb8b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.796082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.796040 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"
Apr 23 08:14:04.914722 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:04.914654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr"]
Apr 23 08:14:04.918440 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:04.918416 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32fba226_f299_4daf_9b75_93ade820fb8b.slice/crio-5b88ea09b53adef63ee6998dca4422dfeddade79b6ce16d4d672b2d6a3c7acb1 WatchSource:0}: Error finding container 5b88ea09b53adef63ee6998dca4422dfeddade79b6ce16d4d672b2d6a3c7acb1: Status 404 returned error can't find the container with id 5b88ea09b53adef63ee6998dca4422dfeddade79b6ce16d4d672b2d6a3c7acb1
Apr 23 08:14:05.322217 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.322131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr" event={"ID":"32fba226-f299-4daf-9b75-93ade820fb8b","Type":"ContainerStarted","Data":"5b88ea09b53adef63ee6998dca4422dfeddade79b6ce16d4d672b2d6a3c7acb1"}
Apr 23 08:14:05.323688 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.323660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/1.log"
Apr 23 08:14:05.324113 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.324095 2576 scope.go:117] "RemoveContainer" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"
Apr 23 08:14:05.324336 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:05.324305 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dwbj6_openshift-console-operator(5c078d65-5482-4cf4-96a9-20d4ce24cf24)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podUID="5c078d65-5482-4cf4-96a9-20d4ce24cf24"
Apr 23 08:14:05.339334 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.339315 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"]
Apr 23 08:14:05.343737 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.343718 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"
Apr 23 08:14:05.346116 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.346096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jvtjl\""
Apr 23 08:14:05.351324 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.351289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"]
Apr 23 08:14:05.409500 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.409473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkg5x_08aba02d-ea63-43f8-9e3d-409a65aa759d/dns-node-resolver/0.log"
Apr 23 08:14:05.466244 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.466198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8mk6\" (UniqueName: \"kubernetes.io/projected/39e863f9-7f2e-4939-8af7-7376f6f63bb0-kube-api-access-q8mk6\") pod \"network-check-source-8894fc9bd-fm4xg\" (UID: \"39e863f9-7f2e-4939-8af7-7376f6f63bb0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"
Apr 23 08:14:05.567212 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.567173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8mk6\" (UniqueName: \"kubernetes.io/projected/39e863f9-7f2e-4939-8af7-7376f6f63bb0-kube-api-access-q8mk6\") pod \"network-check-source-8894fc9bd-fm4xg\" (UID: \"39e863f9-7f2e-4939-8af7-7376f6f63bb0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"
Apr 23 08:14:05.576742 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.576676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8mk6\" (UniqueName: \"kubernetes.io/projected/39e863f9-7f2e-4939-8af7-7376f6f63bb0-kube-api-access-q8mk6\") pod \"network-check-source-8894fc9bd-fm4xg\" (UID: \"39e863f9-7f2e-4939-8af7-7376f6f63bb0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"
Apr 23 08:14:05.655583 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.655557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"
Apr 23 08:14:05.774490 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:05.774461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg"]
Apr 23 08:14:05.777830 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:05.777799 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39e863f9_7f2e_4939_8af7_7376f6f63bb0.slice/crio-1971badcf19b6b925123366483fd57f6a7c4f870ba6e4a67fc706b3c824cd7b1 WatchSource:0}: Error finding container 1971badcf19b6b925123366483fd57f6a7c4f870ba6e4a67fc706b3c824cd7b1: Status 404 returned error can't find the container with id 1971badcf19b6b925123366483fd57f6a7c4f870ba6e4a67fc706b3c824cd7b1
Apr 23 08:14:06.327886 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.327852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" event={"ID":"e4d55644-7fda-4d49-b10d-7977a14862de","Type":"ContainerStarted","Data":"6cc783472ad4ea11967ca246190a4c26d43dedaa3bbada4fd8974a58b0a9d4f0"}
Apr 23 08:14:06.329182 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.329160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg" event={"ID":"39e863f9-7f2e-4939-8af7-7376f6f63bb0","Type":"ContainerStarted","Data":"4c3405986955881e04f38adc7e60d77061aef5111974111e228c6d00cc9a5dc7"}
Apr 23 08:14:06.329182 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.329185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg" event={"ID":"39e863f9-7f2e-4939-8af7-7376f6f63bb0","Type":"ContainerStarted","Data":"1971badcf19b6b925123366483fd57f6a7c4f870ba6e4a67fc706b3c824cd7b1"}
Apr 23 08:14:06.344720 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.344678 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" podStartSLOduration=1.5755819500000001 podStartE2EDuration="4.344666399s" podCreationTimestamp="2026-04-23 08:14:02 +0000 UTC" firstStartedPulling="2026-04-23 08:14:02.877565292 +0000 UTC m=+90.522349918" lastFinishedPulling="2026-04-23 08:14:05.646649731 +0000 UTC m=+93.291434367" observedRunningTime="2026-04-23 08:14:06.344219902 +0000 UTC m=+93.989004547" watchObservedRunningTime="2026-04-23 08:14:06.344666399 +0000 UTC m=+93.989451043"
Apr 23 08:14:06.364471 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.364435 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fm4xg" podStartSLOduration=1.3644221060000001 podStartE2EDuration="1.364422106s" podCreationTimestamp="2026-04-23 08:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:06.363692569 +0000 UTC m=+94.008477212" watchObservedRunningTime="2026-04-23 08:14:06.364422106 +0000 UTC m=+94.009206741"
Apr 23 08:14:06.421531 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:06.421508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-stmzm_152c2fa0-6d86-426a-bbda-00226a6a9cc0/node-ca/0.log"
Apr 23 08:14:07.787712 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:07.787671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"
Apr 23 08:14:07.788115 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:07.787842 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:14:07.788115 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:07.787866 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found
Apr 23 08:14:07.788115 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:07.787950 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.787928759 +0000 UTC m=+103.432713397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found
Apr 23 08:14:08.336144 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:08.336058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr" event={"ID":"32fba226-f299-4daf-9b75-93ade820fb8b","Type":"ContainerStarted","Data":"2a780e892103eefae0b547bbcaca635f4a960470b8038537710a67bf16492f27"}
Apr 23 08:14:08.351903 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:08.351860 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr" podStartSLOduration=1.279871104 podStartE2EDuration="4.351846864s" podCreationTimestamp="2026-04-23 08:14:04 +0000 UTC" firstStartedPulling="2026-04-23 08:14:04.920454322 +0000 UTC m=+92.565238944" lastFinishedPulling="2026-04-23 08:14:07.992430068 +0000 UTC m=+95.637214704" observedRunningTime="2026-04-23 08:14:08.351095233 +0000 UTC m=+95.995879876" watchObservedRunningTime="2026-04-23 08:14:08.351846864 +0000 UTC m=+95.996631508"
Apr 23 08:14:09.602599 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:09.602556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2"
Apr 23 08:14:09.603100 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:09.602640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6"
Apr 23 08:14:09.603100 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:09.602686 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:14:09.603100 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:09.602782 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert podName:6d10c1ba-7b11-4a83-938c-04443e1047c2 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:13.602761205 +0000 UTC m=+161.247545843 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert") pod "ingress-canary-p8pm2" (UID: "6d10c1ba-7b11-4a83-938c-04443e1047c2") : secret "canary-serving-cert" not found
Apr 23 08:14:09.603100 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:09.602797 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:14:09.603100 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:09.602849 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls podName:9ca5f258-f8f2-45d5-8de6-67a2bd1028b3 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:13.602834918 +0000 UTC m=+161.247619544 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls") pod "dns-default-g69k6" (UID: "9ca5f258-f8f2-45d5-8de6-67a2bd1028b3") : secret "dns-default-metrics-tls" not found
Apr 23 08:14:09.835370 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:09.835334 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:09.835370 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:09.835380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:09.835734 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:09.835723 2576 scope.go:117] "RemoveContainer" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"
Apr 23 08:14:09.835889 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:09.835872 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dwbj6_openshift-console-operator(5c078d65-5482-4cf4-96a9-20d4ce24cf24)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podUID="5c078d65-5482-4cf4-96a9-20d4ce24cf24"
Apr 23 08:14:12.048516 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.048484 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-h9tbn"]
Apr 23 08:14:12.051695 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.051679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.054609 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.054585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 23 08:14:12.054774 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.054701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hk6c6\""
Apr 23 08:14:12.056002 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.055881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 23 08:14:12.056002 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.055933 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 23 08:14:12.056197 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.056112 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 23 08:14:12.061636 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.061613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-h9tbn"]
Apr 23 08:14:12.222290 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.222256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cde94229-f202-4dfb-829e-a8b6643aa642-signing-key\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.222440 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.222389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cde94229-f202-4dfb-829e-a8b6643aa642-signing-cabundle\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.222495 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.222436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbz7t\" (UniqueName: \"kubernetes.io/projected/cde94229-f202-4dfb-829e-a8b6643aa642-kube-api-access-nbz7t\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.323483 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.323416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cde94229-f202-4dfb-829e-a8b6643aa642-signing-cabundle\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.323483 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.323460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbz7t\" (UniqueName: \"kubernetes.io/projected/cde94229-f202-4dfb-829e-a8b6643aa642-kube-api-access-nbz7t\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.323483 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.323480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cde94229-f202-4dfb-829e-a8b6643aa642-signing-key\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.324073 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.324054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cde94229-f202-4dfb-829e-a8b6643aa642-signing-cabundle\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.325759 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.325743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cde94229-f202-4dfb-829e-a8b6643aa642-signing-key\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.332342 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.332316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbz7t\" (UniqueName: \"kubernetes.io/projected/cde94229-f202-4dfb-829e-a8b6643aa642-kube-api-access-nbz7t\") pod \"service-ca-865cb79987-h9tbn\" (UID: \"cde94229-f202-4dfb-829e-a8b6643aa642\") " pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.361854 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.361833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-h9tbn"
Apr 23 08:14:12.473834 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:12.473796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-h9tbn"]
Apr 23 08:14:12.477456 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:12.477426 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde94229_f202_4dfb_829e_a8b6643aa642.slice/crio-17f8e426cddf8ba85b5e1027927a3cf3c7ad33b313b03a2d1cf48d793b07d6d8 WatchSource:0}: Error finding container 17f8e426cddf8ba85b5e1027927a3cf3c7ad33b313b03a2d1cf48d793b07d6d8: Status 404 returned error can't find the container with id 17f8e426cddf8ba85b5e1027927a3cf3c7ad33b313b03a2d1cf48d793b07d6d8
Apr 23 08:14:13.273634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:13.273599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jb8d5"
Apr 23 08:14:13.349606 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:13.349570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-h9tbn" event={"ID":"cde94229-f202-4dfb-829e-a8b6643aa642","Type":"ContainerStarted","Data":"a47589403b720199ee3d86b59e2d844e8ab765e528a595ece8dfc3e3e2baf40a"}
Apr 23 08:14:13.349606 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:13.349607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-h9tbn" event={"ID":"cde94229-f202-4dfb-829e-a8b6643aa642","Type":"ContainerStarted","Data":"17f8e426cddf8ba85b5e1027927a3cf3c7ad33b313b03a2d1cf48d793b07d6d8"}
Apr 23 08:14:13.367672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:13.367631 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-h9tbn" podStartSLOduration=1.367616385 podStartE2EDuration="1.367616385s" podCreationTimestamp="2026-04-23 08:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:13.366514433 +0000 UTC m=+101.011299068" watchObservedRunningTime="2026-04-23 08:14:13.367616385 +0000 UTC m=+101.012401028"
Apr 23 08:14:15.855378 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:15.855339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"
Apr 23 08:14:15.855761 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:15.855495 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:14:15.855761 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:15.855518 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fcb8b9bc-jcrwl: secret "image-registry-tls" not found
Apr 23 08:14:15.855761 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:15.855576 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls podName:8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:31.855560557 +0000 UTC m=+119.500345178 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls") pod "image-registry-85fcb8b9bc-jcrwl" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216") : secret "image-registry-tls" not found
Apr 23 08:14:24.002134 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.002100 2576 scope.go:117] "RemoveContainer" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"
Apr 23 08:14:24.376616 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.376585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23 08:14:24.377013 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.376997 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/1.log"
Apr 23 08:14:24.377100 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.377028 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c078d65-5482-4cf4-96a9-20d4ce24cf24" containerID="3a50dce06d09601ec342bf9b741d4eb0041add1e61284db4b5200598d4dc65cc" exitCode=255
Apr 23 08:14:24.377100 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.377092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" event={"ID":"5c078d65-5482-4cf4-96a9-20d4ce24cf24","Type":"ContainerDied","Data":"3a50dce06d09601ec342bf9b741d4eb0041add1e61284db4b5200598d4dc65cc"}
Apr 23 08:14:24.377205 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.377124 2576 scope.go:117] "RemoveContainer" containerID="f16a284b71314146989377c09196ddc6cec8668b7ecb306b8feb25fe3f98305d"
Apr 23 08:14:24.377445 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:24.377430 2576 scope.go:117] "RemoveContainer" containerID="3a50dce06d09601ec342bf9b741d4eb0041add1e61284db4b5200598d4dc65cc"
Apr 23 08:14:24.377643 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:24.377617 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dwbj6_openshift-console-operator(5c078d65-5482-4cf4-96a9-20d4ce24cf24)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podUID="5c078d65-5482-4cf4-96a9-20d4ce24cf24"
Apr 23 08:14:25.381140 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:25.381112 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23 08:14:29.835726 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:29.835676 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:29.835726 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:29.835726 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:29.836208 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:29.836062 2576 scope.go:117] "RemoveContainer" containerID="3a50dce06d09601ec342bf9b741d4eb0041add1e61284db4b5200598d4dc65cc"
Apr 23 08:14:29.836262 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:29.836242 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dwbj6_openshift-console-operator(5c078d65-5482-4cf4-96a9-20d4ce24cf24)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podUID="5c078d65-5482-4cf4-96a9-20d4ce24cf24"
Apr 23 08:14:31.832717 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.832684 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"]
Apr 23 08:14:31.833086 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:31.832871 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" podUID="8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"
Apr 23 08:14:31.860736 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.860699 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6"]
Apr 23 08:14:31.871917 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.871867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"
Apr 23 08:14:31.872238 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.872220 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:31.874799 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.874773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"image-registry-85fcb8b9bc-jcrwl\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:31.875198 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.875177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 08:14:31.875879 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.875856 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-697bj\"" Apr 23 08:14:31.876003 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.875944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6"] Apr 23 08:14:31.876003 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.875952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 08:14:31.940084 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.940058 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-75b96b49cf-p7vvq"] Apr 23 08:14:31.950334 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.950308 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.953228 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.953199 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zwwgv"] Apr 23 08:14:31.972550 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-installation-pull-secrets\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972651 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-trusted-ca\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972651 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-bound-sa-token\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972745 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-sdmb6\" (UID: \"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:31.972745 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sdmb6\" (UID: \"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:31.972745 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-tls\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972858 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c63121f-0c3a-4415-aa44-6b0db0179fc8-ca-trust-extracted\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972858 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-certificates\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 
08:14:31.972941 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsdq\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-kube-api-access-lcsdq\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.972941 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.972900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-image-registry-private-configuration\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:31.974729 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.974689 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75b96b49cf-p7vvq"] Apr 23 08:14:31.974729 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.974717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zwwgv"] Apr 23 08:14:31.975006 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.974835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:31.977140 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.977115 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:14:31.977311 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.977202 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vtw6x\"" Apr 23 08:14:31.977311 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.977268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:14:31.977432 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.977349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:14:31.977432 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:31.977355 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:14:32.074073 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sdmb6\" (UID: \"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:32.074073 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sdmb6\" (UID: 
\"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-tls\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7bef3-14ef-4112-840d-8b9820e79e4b-data-volume\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c63121f-0c3a-4415-aa44-6b0db0179fc8-ca-trust-extracted\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7bef3-14ef-4112-840d-8b9820e79e4b-crio-socket\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074193 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-certificates\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsdq\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-kube-api-access-lcsdq\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-image-registry-private-configuration\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-installation-pull-secrets\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-trusted-ca\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: 
\"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074734 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.074734 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7bef3-14ef-4112-840d-8b9820e79e4b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.074734 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-bound-sa-token\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.074734 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.074424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhss\" (UniqueName: \"kubernetes.io/projected/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-api-access-flhss\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.074960 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:14:32.074810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sdmb6\" (UID: \"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:32.076669 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.076642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sdmb6\" (UID: \"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:32.083829 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.083760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c63121f-0c3a-4415-aa44-6b0db0179fc8-ca-trust-extracted\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.084088 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.084067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-image-registry-private-configuration\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.084235 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.084203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-tls\") pod 
\"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.084457 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.084439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-registry-certificates\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.084522 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.084509 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c63121f-0c3a-4415-aa44-6b0db0179fc8-trusted-ca\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.085805 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.085781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-bound-sa-token\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.086057 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.086036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsdq\" (UniqueName: \"kubernetes.io/projected/9c63121f-0c3a-4415-aa44-6b0db0179fc8-kube-api-access-lcsdq\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.086248 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.086232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c63121f-0c3a-4415-aa44-6b0db0179fc8-installation-pull-secrets\") pod \"image-registry-75b96b49cf-p7vvq\" (UID: \"9c63121f-0c3a-4415-aa44-6b0db0179fc8\") " pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.175004 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.174973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d9f7bef3-14ef-4112-840d-8b9820e79e4b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175004 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flhss\" (UniqueName: \"kubernetes.io/projected/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-api-access-flhss\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175235 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7bef3-14ef-4112-840d-8b9820e79e4b-data-volume\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175235 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7bef3-14ef-4112-840d-8b9820e79e4b-crio-socket\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " 
pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175235 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175235 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d9f7bef3-14ef-4112-840d-8b9820e79e4b-crio-socket\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175487 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d9f7bef3-14ef-4112-840d-8b9820e79e4b-data-volume\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.175677 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.175661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.177334 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.177313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/d9f7bef3-14ef-4112-840d-8b9820e79e4b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.183219 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.183195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhss\" (UniqueName: \"kubernetes.io/projected/d9f7bef3-14ef-4112-840d-8b9820e79e4b-kube-api-access-flhss\") pod \"insights-runtime-extractor-zwwgv\" (UID: \"d9f7bef3-14ef-4112-840d-8b9820e79e4b\") " pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.187987 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.187972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" Apr 23 08:14:32.269638 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.261110 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:32.286608 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.284511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zwwgv" Apr 23 08:14:32.328362 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.328329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6"] Apr 23 08:14:32.334246 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:32.334193 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf9e2c3_8864_4eb2_8cf6_02d901f1f6a2.slice/crio-48b1df3ae9901b7dda9de840bb9b6387b4b31a2119f56c77342f3863a9c267c7 WatchSource:0}: Error finding container 48b1df3ae9901b7dda9de840bb9b6387b4b31a2119f56c77342f3863a9c267c7: Status 404 returned error can't find the container with id 48b1df3ae9901b7dda9de840bb9b6387b4b31a2119f56c77342f3863a9c267c7 Apr 23 08:14:32.400463 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.400426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" event={"ID":"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2","Type":"ContainerStarted","Data":"48b1df3ae9901b7dda9de840bb9b6387b4b31a2119f56c77342f3863a9c267c7"} Apr 23 08:14:32.400573 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.400496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:32.405785 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.405765 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:32.411383 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.411357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75b96b49cf-p7vvq"] Apr 23 08:14:32.415703 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:32.415682 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c63121f_0c3a_4415_aa44_6b0db0179fc8.slice/crio-1e6c7f0ab06482e17e24c7084c3bfdf1eab41337ffc890c092bcabf61045d0c9 WatchSource:0}: Error finding container 1e6c7f0ab06482e17e24c7084c3bfdf1eab41337ffc890c092bcabf61045d0c9: Status 404 returned error can't find the container with id 1e6c7f0ab06482e17e24c7084c3bfdf1eab41337ffc890c092bcabf61045d0c9 Apr 23 08:14:32.427824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.427804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zwwgv"] Apr 23 08:14:32.430354 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:32.430323 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f7bef3_14ef_4112_840d_8b9820e79e4b.slice/crio-f1fb9ebe9650b6a98307def423edacf6551660db118622ba231fb04a0d05bdad WatchSource:0}: Error finding container f1fb9ebe9650b6a98307def423edacf6551660db118622ba231fb04a0d05bdad: Status 404 returned error can't find the container with id f1fb9ebe9650b6a98307def423edacf6551660db118622ba231fb04a0d05bdad Apr 23 08:14:32.479656 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 
08:14:32.479748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479674 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479697 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxrqj\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479712 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479771 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479787 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.479920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets\") pod \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\" (UID: \"8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216\") " Apr 23 08:14:32.480119 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.479982 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:14:32.480119 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.480106 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-certificates\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.480567 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.480515 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:14:32.480567 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.480544 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:14:32.482074 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.482045 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj" (OuterVolumeSpecName: "kube-api-access-qxrqj") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "kube-api-access-qxrqj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:14:32.482168 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.482142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:14:32.482252 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.482231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:14:32.482521 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.482489 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:14:32.482733 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.482700 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" (UID: "8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:14:32.581305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581279 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-installation-pull-secrets\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.581305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581302 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-bound-sa-token\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.581305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581311 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qxrqj\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-kube-api-access-qxrqj\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" 
Apr 23 08:14:32.581480 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581320 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-trusted-ca\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.581480 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581329 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-registry-tls\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.581480 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581338 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-ca-trust-extracted\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:32.581480 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:32.581346 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216-image-registry-private-configuration\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:14:33.405066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.405030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwwgv" event={"ID":"d9f7bef3-14ef-4112-840d-8b9820e79e4b","Type":"ContainerStarted","Data":"94d34c298e909c20267558a556e317a140cf36f8ecaf5ddbcd8b845dfeb811c5"} Apr 23 08:14:33.405066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.405069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwwgv" event={"ID":"d9f7bef3-14ef-4112-840d-8b9820e79e4b","Type":"ContainerStarted","Data":"cc86fda1c32b871815fb5868377cb363ece0d812d257b6f06950522db524a87b"} Apr 23 
08:14:33.405428 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.405080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwwgv" event={"ID":"d9f7bef3-14ef-4112-840d-8b9820e79e4b","Type":"ContainerStarted","Data":"f1fb9ebe9650b6a98307def423edacf6551660db118622ba231fb04a0d05bdad"} Apr 23 08:14:33.406255 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.406234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" event={"ID":"0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2","Type":"ContainerStarted","Data":"fbb525e8285307d78efcbfe3e522ba0fb6d597af6ed405678985396d14fdc565"} Apr 23 08:14:33.407486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.407466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85fcb8b9bc-jcrwl" Apr 23 08:14:33.407486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.407474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" event={"ID":"9c63121f-0c3a-4415-aa44-6b0db0179fc8","Type":"ContainerStarted","Data":"852bb9ff11e816ed29577f39d786d0277ae34b1aab320d760b05e5f30c6258df"} Apr 23 08:14:33.407635 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.407499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" event={"ID":"9c63121f-0c3a-4415-aa44-6b0db0179fc8","Type":"ContainerStarted","Data":"1e6c7f0ab06482e17e24c7084c3bfdf1eab41337ffc890c092bcabf61045d0c9"} Apr 23 08:14:33.407635 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.407562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:33.420425 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.420346 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sdmb6" podStartSLOduration=1.4650989220000001 podStartE2EDuration="2.420334998s" podCreationTimestamp="2026-04-23 08:14:31 +0000 UTC" firstStartedPulling="2026-04-23 08:14:32.336553665 +0000 UTC m=+119.981338288" lastFinishedPulling="2026-04-23 08:14:33.291789726 +0000 UTC m=+120.936574364" observedRunningTime="2026-04-23 08:14:33.419889272 +0000 UTC m=+121.064673915" watchObservedRunningTime="2026-04-23 08:14:33.420334998 +0000 UTC m=+121.065119643" Apr 23 08:14:33.442325 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.442279 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" podStartSLOduration=2.442264261 podStartE2EDuration="2.442264261s" podCreationTimestamp="2026-04-23 08:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:33.440763358 +0000 UTC m=+121.085548012" watchObservedRunningTime="2026-04-23 08:14:33.442264261 +0000 UTC m=+121.087048906" Apr 23 08:14:33.465936 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.465893 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"] Apr 23 08:14:33.469778 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:33.469757 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85fcb8b9bc-jcrwl"] Apr 23 08:14:34.233728 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.233697 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b"] Apr 23 08:14:34.236113 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.236088 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:34.238447 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.238428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-jxjxg\"" Apr 23 08:14:34.238747 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.238729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 08:14:34.245151 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.244455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b"] Apr 23 08:14:34.296990 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.296959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4bd5b857-ad33-4875-93dc-f093e035eac7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bkk8b\" (UID: \"4bd5b857-ad33-4875-93dc-f093e035eac7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:34.397983 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.397950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4bd5b857-ad33-4875-93dc-f093e035eac7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bkk8b\" (UID: \"4bd5b857-ad33-4875-93dc-f093e035eac7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:34.400543 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.400515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/4bd5b857-ad33-4875-93dc-f093e035eac7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bkk8b\" (UID: \"4bd5b857-ad33-4875-93dc-f093e035eac7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:34.548203 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.548184 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:34.663814 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:34.663783 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b"] Apr 23 08:14:34.666880 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:34.666847 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd5b857_ad33_4875_93dc_f093e035eac7.slice/crio-1a9f603fad5ea4a17209c122bc9522db8bd9f59c16e547b30ee5fa9f28fb1038 WatchSource:0}: Error finding container 1a9f603fad5ea4a17209c122bc9522db8bd9f59c16e547b30ee5fa9f28fb1038: Status 404 returned error can't find the container with id 1a9f603fad5ea4a17209c122bc9522db8bd9f59c16e547b30ee5fa9f28fb1038 Apr 23 08:14:35.005443 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:35.005413 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216" path="/var/lib/kubelet/pods/8fc63e5a-7c7e-4dc6-a19e-6b0507d7a216/volumes" Apr 23 08:14:35.414625 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:35.414584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" event={"ID":"4bd5b857-ad33-4875-93dc-f093e035eac7","Type":"ContainerStarted","Data":"1a9f603fad5ea4a17209c122bc9522db8bd9f59c16e547b30ee5fa9f28fb1038"} Apr 23 08:14:35.416710 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:14:35.416680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwwgv" event={"ID":"d9f7bef3-14ef-4112-840d-8b9820e79e4b","Type":"ContainerStarted","Data":"7c2085b1c6a1a0025b83ff59bb7eb7f295d98ede4d64b60f26519f87f4c52b9b"} Apr 23 08:14:35.454671 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:35.454623 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zwwgv" podStartSLOduration=2.460503745 podStartE2EDuration="4.454605383s" podCreationTimestamp="2026-04-23 08:14:31 +0000 UTC" firstStartedPulling="2026-04-23 08:14:32.538739446 +0000 UTC m=+120.183524068" lastFinishedPulling="2026-04-23 08:14:34.532841081 +0000 UTC m=+122.177625706" observedRunningTime="2026-04-23 08:14:35.454416208 +0000 UTC m=+123.099200853" watchObservedRunningTime="2026-04-23 08:14:35.454605383 +0000 UTC m=+123.099390029" Apr 23 08:14:36.419863 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:36.419824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" event={"ID":"4bd5b857-ad33-4875-93dc-f093e035eac7","Type":"ContainerStarted","Data":"205d69ef3fbcfbe0fc21148278dccce3b0889b9a9442ab312a4f61c5ed80be18"} Apr 23 08:14:36.420318 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:36.420066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:36.424629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:36.424606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" Apr 23 08:14:36.436117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:36.436077 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bkk8b" podStartSLOduration=1.351018 podStartE2EDuration="2.436067064s" podCreationTimestamp="2026-04-23 08:14:34 +0000 UTC" firstStartedPulling="2026-04-23 08:14:34.668622841 +0000 UTC m=+122.313407466" lastFinishedPulling="2026-04-23 08:14:35.753671905 +0000 UTC m=+123.398456530" observedRunningTime="2026-04-23 08:14:36.436025945 +0000 UTC m=+124.080810600" watchObservedRunningTime="2026-04-23 08:14:36.436067064 +0000 UTC m=+124.080851708" Apr 23 08:14:41.660123 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.660089 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2fh9j"] Apr 23 08:14:41.663986 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.663965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.666810 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.666785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74z6\"" Apr 23 08:14:41.667130 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667106 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:14:41.667419 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:14:41.667865 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667850 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:14:41.668082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" 
Apr 23 08:14:41.668150 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667860 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:14:41.668150 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.667883 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:14:41.755012 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.754987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-tls\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-node-exporter-wtmp\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04db17f1-1934-4280-804c-b2639a712354-node-exporter-textfile\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvw8\" (UniqueName: 
\"kubernetes.io/projected/04db17f1-1934-4280-804c-b2639a712354-kube-api-access-8pvw8\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-sys\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755340 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-root\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755340 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755340 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.755284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-node-exporter-accelerators-collector-config\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.755340 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:14:41.755317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-metrics-client-ca\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.855819 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.855773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-node-exporter-accelerators-collector-config\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.855819 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.855826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-metrics-client-ca\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856102 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.855863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-tls\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856102 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.855948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-node-exporter-wtmp\") pod \"node-exporter-2fh9j\" (UID: 
\"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856102 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04db17f1-1934-4280-804c-b2639a712354-node-exporter-textfile\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856102 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvw8\" (UniqueName: \"kubernetes.io/projected/04db17f1-1934-4280-804c-b2639a712354-kube-api-access-8pvw8\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856102 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-sys\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856345 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-root\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856345 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-kube-rbac-proxy-config\") 
pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-sys\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-metrics-client-ca\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04db17f1-1934-4280-804c-b2639a712354-node-exporter-accelerators-collector-config\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-root\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856755 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04db17f1-1934-4280-804c-b2639a712354-node-exporter-wtmp\") pod 
\"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.856755 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.856734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04db17f1-1934-4280-804c-b2639a712354-node-exporter-textfile\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.858444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.858415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.858636 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.858619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04db17f1-1934-4280-804c-b2639a712354-node-exporter-tls\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.863233 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.863208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvw8\" (UniqueName: \"kubernetes.io/projected/04db17f1-1934-4280-804c-b2639a712354-kube-api-access-8pvw8\") pod \"node-exporter-2fh9j\" (UID: \"04db17f1-1934-4280-804c-b2639a712354\") " pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.974107 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:41.974018 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2fh9j" Apr 23 08:14:41.985276 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:41.985242 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04db17f1_1934_4280_804c_b2639a712354.slice/crio-ffa837a25cc35100a8e329ff4f3c272d966f6b67621386613b57421f4c544f08 WatchSource:0}: Error finding container ffa837a25cc35100a8e329ff4f3c272d966f6b67621386613b57421f4c544f08: Status 404 returned error can't find the container with id ffa837a25cc35100a8e329ff4f3c272d966f6b67621386613b57421f4c544f08 Apr 23 08:14:42.442163 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.442111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fh9j" event={"ID":"04db17f1-1934-4280-804c-b2639a712354","Type":"ContainerStarted","Data":"ffa837a25cc35100a8e329ff4f3c272d966f6b67621386613b57421f4c544f08"} Apr 23 08:14:42.713509 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.713428 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:14:42.716526 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.716505 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.718806 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.718781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:14:42.718961 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.718937 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:14:42.719033 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.718983 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:14:42.719204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719116 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:14:42.719350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719281 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:14:42.719350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719289 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:14:42.719350 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fb9ht\"" Apr 23 08:14:42.719540 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:14:42.719540 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:14:42.719540 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.719414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:14:42.731836 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.731813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:14:42.763479 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763562 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763562 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763642 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763642 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763703 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763703 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763765 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vwc\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763765 
ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763826 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763858 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763890 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.763890 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.763877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865465 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865465 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865465 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:14:42.865629 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:14:42.865479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:14:42.865674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49vwc\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.865824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.865808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.867400 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.866718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.867400 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.867031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.867400 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:42.867101 2576 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 08:14:42.867400 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:42.867182 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle podName:d7aa592b-ea0f-484f-ac85-c57aae7ccce8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:43.36716154 +0000 UTC m=+131.011946165 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8") : configmap references non-existent config key: ca-bundle.crt Apr 23 08:14:42.867400 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:14:42.867210 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls podName:d7aa592b-ea0f-484f-ac85-c57aae7ccce8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:43.367199037 +0000 UTC m=+131.011983674 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8") : secret "alertmanager-main-tls" not found Apr 23 08:14:42.868814 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.868789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.869737 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.869697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.869993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.870088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.870250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.870285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.870316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.870688 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.870554 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df2ff433-01c3-442f-b962-0dbfe4dd622f-metrics-certs\") pod \"network-metrics-daemon-fflkd\" (UID: \"df2ff433-01c3-442f-b962-0dbfe4dd622f\") " pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:14:42.871644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.871617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:42.875248 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:42.875225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vwc\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.027034 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.026966 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\"" Apr 23 08:14:43.035789 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.035772 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fflkd" Apr 23 08:14:43.148051 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.148019 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fflkd"] Apr 23 08:14:43.151953 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:43.151892 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2ff433_01c3_442f_b962_0dbfe4dd622f.slice/crio-4321cd92e7b0e5b654e9e8d1a8026bb2825ba7ee275aa636e71346690a5bb8f8 WatchSource:0}: Error finding container 4321cd92e7b0e5b654e9e8d1a8026bb2825ba7ee275aa636e71346690a5bb8f8: Status 404 returned error can't find the container with id 4321cd92e7b0e5b654e9e8d1a8026bb2825ba7ee275aa636e71346690a5bb8f8 Apr 23 08:14:43.369268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.369163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.369268 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.369245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.370115 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.370094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.371629 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.371606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.445570 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.445533 2576 generic.go:358] "Generic (PLEG): container finished" podID="04db17f1-1934-4280-804c-b2639a712354" containerID="9f13e52a458a1bb19cfc7e6428b05bcf2cbc1de68b605d077ed7d199a3aedede" exitCode=0 Apr 23 08:14:43.445740 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.445624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fh9j" event={"ID":"04db17f1-1934-4280-804c-b2639a712354","Type":"ContainerDied","Data":"9f13e52a458a1bb19cfc7e6428b05bcf2cbc1de68b605d077ed7d199a3aedede"} Apr 23 08:14:43.446736 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.446712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fflkd" event={"ID":"df2ff433-01c3-442f-b962-0dbfe4dd622f","Type":"ContainerStarted","Data":"4321cd92e7b0e5b654e9e8d1a8026bb2825ba7ee275aa636e71346690a5bb8f8"} Apr 23 08:14:43.626999 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.626969 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:14:43.772274 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:43.772235 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:14:43.775952 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:43.775923 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7aa592b_ea0f_484f_ac85_c57aae7ccce8.slice/crio-05ffc45f4db9ec7eb3d9c9f18d460d438fbe4494ae1dd99faf71f1df5d07b55b WatchSource:0}: Error finding container 05ffc45f4db9ec7eb3d9c9f18d460d438fbe4494ae1dd99faf71f1df5d07b55b: Status 404 returned error can't find the container with id 05ffc45f4db9ec7eb3d9c9f18d460d438fbe4494ae1dd99faf71f1df5d07b55b Apr 23 08:14:44.451051 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.451017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"05ffc45f4db9ec7eb3d9c9f18d460d438fbe4494ae1dd99faf71f1df5d07b55b"} Apr 23 08:14:44.452527 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.452498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fflkd" event={"ID":"df2ff433-01c3-442f-b962-0dbfe4dd622f","Type":"ContainerStarted","Data":"4b63cb488748bf7d3d34d8a156a540bf17c01db7c5b940904eacfcda0d7877dc"} Apr 23 08:14:44.452643 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.452533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fflkd" event={"ID":"df2ff433-01c3-442f-b962-0dbfe4dd622f","Type":"ContainerStarted","Data":"422d4edc15a375cbb0125022019920493564006e34ccdef1aa971c4e4e8afc58"} Apr 23 08:14:44.454141 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.454118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-2fh9j" event={"ID":"04db17f1-1934-4280-804c-b2639a712354","Type":"ContainerStarted","Data":"e00a7317dc57de69a808414c91f720d2e93d5e8b7628b3fec94b4397b2385e77"} Apr 23 08:14:44.454229 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.454149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fh9j" event={"ID":"04db17f1-1934-4280-804c-b2639a712354","Type":"ContainerStarted","Data":"25cac00739df62a365ea1529f9a102c2dff023ed0b161453a04a75e931ce3ff6"} Apr 23 08:14:44.467232 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.467188 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fflkd" podStartSLOduration=130.528281515 podStartE2EDuration="2m11.467172288s" podCreationTimestamp="2026-04-23 08:12:33 +0000 UTC" firstStartedPulling="2026-04-23 08:14:43.15378507 +0000 UTC m=+130.798569695" lastFinishedPulling="2026-04-23 08:14:44.092675832 +0000 UTC m=+131.737460468" observedRunningTime="2026-04-23 08:14:44.466247121 +0000 UTC m=+132.111031765" watchObservedRunningTime="2026-04-23 08:14:44.467172288 +0000 UTC m=+132.111956952" Apr 23 08:14:44.486926 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:44.486867 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2fh9j" podStartSLOduration=2.742600251 podStartE2EDuration="3.486845835s" podCreationTimestamp="2026-04-23 08:14:41 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.987332982 +0000 UTC m=+129.632117610" lastFinishedPulling="2026-04-23 08:14:42.731578567 +0000 UTC m=+130.376363194" observedRunningTime="2026-04-23 08:14:44.486728263 +0000 UTC m=+132.131512907" watchObservedRunningTime="2026-04-23 08:14:44.486845835 +0000 UTC m=+132.131630478" Apr 23 08:14:45.008600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.008571 2576 scope.go:117] "RemoveContainer" 
containerID="3a50dce06d09601ec342bf9b741d4eb0041add1e61284db4b5200598d4dc65cc"
Apr 23 08:14:45.459060 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.459025 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c" exitCode=0
Apr 23 08:14:45.459217 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.459115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"}
Apr 23 08:14:45.460810 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.460793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23 08:14:45.460937 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.460919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" event={"ID":"5c078d65-5482-4cf4-96a9-20d4ce24cf24","Type":"ContainerStarted","Data":"96217ff06e1e4f64d995e2ec399f2f94780a6a3accc866bc414887066071a3bb"}
Apr 23 08:14:45.461619 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.461587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:45.501512 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.501454 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6" podStartSLOduration=44.193084125 podStartE2EDuration="46.501438034s" podCreationTimestamp="2026-04-23 08:13:59 +0000 UTC" firstStartedPulling="2026-04-23 08:13:59.969294933 +0000 UTC m=+87.614079555" lastFinishedPulling="2026-04-23 08:14:02.277648827 +0000 UTC m=+89.922433464" observedRunningTime="2026-04-23 08:14:45.499759508 +0000 UTC m=+133.144544154" watchObservedRunningTime="2026-04-23 08:14:45.501438034 +0000 UTC m=+133.146222681"
Apr 23 08:14:45.566321 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:45.566258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-dwbj6"
Apr 23 08:14:46.070327 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.070299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f9457c4cc-84fth"]
Apr 23 08:14:46.072500 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.072482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.074644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.074625 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 08:14:46.074770 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.074751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-42tqhqmomlphj\""
Apr 23 08:14:46.075057 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.075035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-przmh\""
Apr 23 08:14:46.075153 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.075105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 08:14:46.075545 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.075531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 08:14:46.075619 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.075604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 08:14:46.081561 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.081542 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f9457c4cc-84fth"]
Apr 23 08:14:46.195408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-client-certs\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195593 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-metrics-server-audit-profiles\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195593 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e97294-08d4-4a90-a5a4-1df7d8169357-audit-log\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195708 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-tls\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195708 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195708 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwmw5\" (UniqueName: \"kubernetes.io/projected/70e97294-08d4-4a90-a5a4-1df7d8169357-kube-api-access-bwmw5\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.195859 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.195745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-client-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297244 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-client-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297394 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-client-certs\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297394 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-metrics-server-audit-profiles\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297394 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e97294-08d4-4a90-a5a4-1df7d8169357-audit-log\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297394 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-tls\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297394 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.297699 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.297418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwmw5\" (UniqueName: \"kubernetes.io/projected/70e97294-08d4-4a90-a5a4-1df7d8169357-kube-api-access-bwmw5\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.298253 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.298223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e97294-08d4-4a90-a5a4-1df7d8169357-audit-log\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.299819 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.298840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-metrics-server-audit-profiles\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.299819 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.298900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e97294-08d4-4a90-a5a4-1df7d8169357-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.303719 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.300414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-client-certs\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.303719 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.300568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-secret-metrics-server-tls\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.303719 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.300707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e97294-08d4-4a90-a5a4-1df7d8169357-client-ca-bundle\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.305261 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.305241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwmw5\" (UniqueName: \"kubernetes.io/projected/70e97294-08d4-4a90-a5a4-1df7d8169357-kube-api-access-bwmw5\") pod \"metrics-server-6f9457c4cc-84fth\" (UID: \"70e97294-08d4-4a90-a5a4-1df7d8169357\") " pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.382767 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.382735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth"
Apr 23 08:14:46.510081 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:46.510058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f9457c4cc-84fth"]
Apr 23 08:14:46.511734 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:46.511707 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e97294_08d4_4a90_a5a4_1df7d8169357.slice/crio-783467e584a2e16da030dcada28a3cab5ee5b10e212ca612d6590e96bf7c7280 WatchSource:0}: Error finding container 783467e584a2e16da030dcada28a3cab5ee5b10e212ca612d6590e96bf7c7280: Status 404 returned error can't find the container with id 783467e584a2e16da030dcada28a3cab5ee5b10e212ca612d6590e96bf7c7280
Apr 23 08:14:47.470762 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.470696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"}
Apr 23 08:14:47.470762 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.470736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"}
Apr 23 08:14:47.470762 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.470751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"}
Apr 23 08:14:47.470762 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.470765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"}
Apr 23 08:14:47.471832 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.471787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" event={"ID":"70e97294-08d4-4a90-a5a4-1df7d8169357","Type":"ContainerStarted","Data":"783467e584a2e16da030dcada28a3cab5ee5b10e212ca612d6590e96bf7c7280"}
Apr 23 08:14:47.976537 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.976506 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:14:47.979704 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.979683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:47.982216 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 08:14:47.982216 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 08:14:47.982461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982196 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 08:14:47.982461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 08:14:47.982461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 08:14:47.982707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982520 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 08:14:47.982707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 08:14:47.982707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-q94ts\""
Apr 23 08:14:47.982961 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 08:14:47.982961 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 08:14:47.982961 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.982960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 08:14:47.983232 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.983009 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 08:14:47.983232 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.983085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9ptno4f5oafb0\""
Apr 23 08:14:47.986108 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.986089 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 08:14:47.997576 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:47.997555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:14:48.114925 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.114878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115036 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.114945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115036 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.114990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115036 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115174 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115174 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115324 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115324 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115324 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdp4m\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115414 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115414 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115414 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115501 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115501 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115501 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.115589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.115532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216558 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdp4m\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.216847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.217250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.217250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.216978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.217250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.217004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.217250 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.217158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.218097 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.217764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.218252 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.218192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.220560 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.220476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.221697 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.220731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.221697 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.221210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.221697 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.221296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.221697 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.221660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.223223 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.223176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.223318 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.223219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.223398 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.223366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.223815 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.223792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.223923 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.223890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.224094 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.224069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:14:48.224233 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.224210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:14:48.225937 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.225894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:14:48.226033 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.226017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdp4m\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m\") pod \"prometheus-k8s-0\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:14:48.291372 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.291292 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:14:48.433482 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.433447 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:14:48.477146 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.477112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"} Apr 23 08:14:48.478500 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.478473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" event={"ID":"70e97294-08d4-4a90-a5a4-1df7d8169357","Type":"ContainerStarted","Data":"76e66baf7c191873dd2963c9d09fe54aff86043e1a6e8736043c711242e6e2e6"} Apr 23 08:14:48.480136 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:14:48.480112 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5fe44d_cda0_438a_a2f3_b01116ca9337.slice/crio-e33e5cb4872cad470e9cac5067c45ccd75b7a4e7b3364ed50bfc5d4aeaa269fe WatchSource:0}: Error finding container e33e5cb4872cad470e9cac5067c45ccd75b7a4e7b3364ed50bfc5d4aeaa269fe: Status 404 returned error can't find the container with id e33e5cb4872cad470e9cac5067c45ccd75b7a4e7b3364ed50bfc5d4aeaa269fe Apr 23 08:14:48.496542 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:48.496497 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" podStartSLOduration=0.914866444 podStartE2EDuration="2.496484524s" podCreationTimestamp="2026-04-23 08:14:46 +0000 UTC" firstStartedPulling="2026-04-23 08:14:46.513614466 +0000 UTC m=+134.158399088" lastFinishedPulling="2026-04-23 08:14:48.095232532 +0000 UTC m=+135.740017168" 
observedRunningTime="2026-04-23 08:14:48.494323914 +0000 UTC m=+136.139108557" watchObservedRunningTime="2026-04-23 08:14:48.496484524 +0000 UTC m=+136.141269172" Apr 23 08:14:49.482817 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:49.482783 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a" exitCode=0 Apr 23 08:14:49.483224 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:49.482880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} Apr 23 08:14:49.483224 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:49.482934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"e33e5cb4872cad470e9cac5067c45ccd75b7a4e7b3364ed50bfc5d4aeaa269fe"} Apr 23 08:14:49.486021 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:49.485997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerStarted","Data":"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"} Apr 23 08:14:49.537710 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:49.537666 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.798657274 podStartE2EDuration="7.537653205s" podCreationTimestamp="2026-04-23 08:14:42 +0000 UTC" firstStartedPulling="2026-04-23 08:14:43.777989726 +0000 UTC m=+131.422774349" lastFinishedPulling="2026-04-23 08:14:48.516985637 +0000 UTC m=+136.161770280" observedRunningTime="2026-04-23 08:14:49.534732424 +0000 UTC m=+137.179517068" 
watchObservedRunningTime="2026-04-23 08:14:49.537653205 +0000 UTC m=+137.182437848" Apr 23 08:14:52.503811 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:52.503730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} Apr 23 08:14:52.503811 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:52.503783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} Apr 23 08:14:54.414875 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.414840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-75b96b49cf-p7vvq" Apr 23 08:14:54.517093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.517063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} Apr 23 08:14:54.517093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.517096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} Apr 23 08:14:54.517296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.517106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} Apr 23 08:14:54.517296 
ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.517114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerStarted","Data":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} Apr 23 08:14:54.545201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:54.545115 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.916006507 podStartE2EDuration="7.545097741s" podCreationTimestamp="2026-04-23 08:14:47 +0000 UTC" firstStartedPulling="2026-04-23 08:14:49.484096312 +0000 UTC m=+137.128880934" lastFinishedPulling="2026-04-23 08:14:54.113187547 +0000 UTC m=+141.757972168" observedRunningTime="2026-04-23 08:14:54.543564049 +0000 UTC m=+142.188348722" watchObservedRunningTime="2026-04-23 08:14:54.545097741 +0000 UTC m=+142.189882386" Apr 23 08:14:58.292509 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:14:58.292468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:15:06.384083 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:06.384053 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" Apr 23 08:15:06.384083 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:06.384092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" Apr 23 08:15:08.723260 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:15:08.723218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-g69k6" podUID="9ca5f258-f8f2-45d5-8de6-67a2bd1028b3" Apr 23 08:15:08.735421 ip-10-0-139-180 kubenswrapper[2576]: E0423 
08:15:08.735379 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p8pm2" podUID="6d10c1ba-7b11-4a83-938c-04443e1047c2" Apr 23 08:15:09.558573 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:09.558544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:13.648706 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.648664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:13.649112 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.648762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2" Apr 23 08:15:13.651044 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.651019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ca5f258-f8f2-45d5-8de6-67a2bd1028b3-metrics-tls\") pod \"dns-default-g69k6\" (UID: \"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3\") " pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:13.651162 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.651130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d10c1ba-7b11-4a83-938c-04443e1047c2-cert\") pod \"ingress-canary-p8pm2\" (UID: \"6d10c1ba-7b11-4a83-938c-04443e1047c2\") " pod="openshift-ingress-canary/ingress-canary-p8pm2" Apr 23 
08:15:13.762175 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.762152 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\"" Apr 23 08:15:13.769449 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.769429 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:13.885047 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:13.885022 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g69k6"] Apr 23 08:15:13.889355 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:15:13.889323 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca5f258_f8f2_45d5_8de6_67a2bd1028b3.slice/crio-66c7095b828cc33758e896782d20945ae5b99cc0a62efe192bae53e1f115895d WatchSource:0}: Error finding container 66c7095b828cc33758e896782d20945ae5b99cc0a62efe192bae53e1f115895d: Status 404 returned error can't find the container with id 66c7095b828cc33758e896782d20945ae5b99cc0a62efe192bae53e1f115895d Apr 23 08:15:14.572376 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:14.572343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g69k6" event={"ID":"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3","Type":"ContainerStarted","Data":"66c7095b828cc33758e896782d20945ae5b99cc0a62efe192bae53e1f115895d"} Apr 23 08:15:16.579204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:16.579165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g69k6" event={"ID":"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3","Type":"ContainerStarted","Data":"511374ac1445791319cff4166e02918b0d14e5d8cf762738309da02cb32b8a12"} Apr 23 08:15:16.579204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:16.579203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g69k6" 
event={"ID":"9ca5f258-f8f2-45d5-8de6-67a2bd1028b3","Type":"ContainerStarted","Data":"21cc335df67fafc3b66789267492c215147eee84cdf746749713d41f1e75e71c"} Apr 23 08:15:16.579606 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:16.579281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:16.596500 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:16.596450 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g69k6" podStartSLOduration=129.889588974 podStartE2EDuration="2m11.596436939s" podCreationTimestamp="2026-04-23 08:13:05 +0000 UTC" firstStartedPulling="2026-04-23 08:15:13.890729016 +0000 UTC m=+161.535513638" lastFinishedPulling="2026-04-23 08:15:15.597576978 +0000 UTC m=+163.242361603" observedRunningTime="2026-04-23 08:15:16.596306335 +0000 UTC m=+164.241090979" watchObservedRunningTime="2026-04-23 08:15:16.596436939 +0000 UTC m=+164.241221583" Apr 23 08:15:22.002643 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:22.002550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8pm2" Apr 23 08:15:22.005229 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:22.005212 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\"" Apr 23 08:15:22.013591 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:22.013570 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8pm2" Apr 23 08:15:22.339447 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:22.339425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8pm2"] Apr 23 08:15:22.342051 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:15:22.342024 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d10c1ba_7b11_4a83_938c_04443e1047c2.slice/crio-071d9af3a3404b56ee7b2fc7cd571c5d85e15a08282f03e579ad5b733fd88fa3 WatchSource:0}: Error finding container 071d9af3a3404b56ee7b2fc7cd571c5d85e15a08282f03e579ad5b733fd88fa3: Status 404 returned error can't find the container with id 071d9af3a3404b56ee7b2fc7cd571c5d85e15a08282f03e579ad5b733fd88fa3 Apr 23 08:15:22.597575 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:22.597492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p8pm2" event={"ID":"6d10c1ba-7b11-4a83-938c-04443e1047c2","Type":"ContainerStarted","Data":"071d9af3a3404b56ee7b2fc7cd571c5d85e15a08282f03e579ad5b733fd88fa3"} Apr 23 08:15:23.584086 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:23.584060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g69k6_9ca5f258-f8f2-45d5-8de6-67a2bd1028b3/dns/0.log" Apr 23 08:15:23.590943 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:23.590904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g69k6_9ca5f258-f8f2-45d5-8de6-67a2bd1028b3/kube-rbac-proxy/0.log" Apr 23 08:15:24.090513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:24.090435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkg5x_08aba02d-ea63-43f8-9e3d-409a65aa759d/dns-node-resolver/0.log" Apr 23 08:15:24.604524 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:24.604492 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-canary/ingress-canary-p8pm2" event={"ID":"6d10c1ba-7b11-4a83-938c-04443e1047c2","Type":"ContainerStarted","Data":"2cacc07fbffa8d4ff3f6cae1b84f1bf0f43095a13f9fe688dd1a4d72c9451a25"} Apr 23 08:15:24.622377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:24.622329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p8pm2" podStartSLOduration=138.12448299 podStartE2EDuration="2m19.622316338s" podCreationTimestamp="2026-04-23 08:13:05 +0000 UTC" firstStartedPulling="2026-04-23 08:15:22.344161184 +0000 UTC m=+169.988945809" lastFinishedPulling="2026-04-23 08:15:23.841994528 +0000 UTC m=+171.486779157" observedRunningTime="2026-04-23 08:15:24.621195935 +0000 UTC m=+172.265980593" watchObservedRunningTime="2026-04-23 08:15:24.622316338 +0000 UTC m=+172.267100982" Apr 23 08:15:26.388577 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:26.388552 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" Apr 23 08:15:26.392287 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:26.392263 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f9457c4cc-84fth" Apr 23 08:15:26.584818 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:26.584792 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g69k6" Apr 23 08:15:29.620198 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:29.620162 2576 generic.go:358] "Generic (PLEG): container finished" podID="32fba226-f299-4daf-9b75-93ade820fb8b" containerID="2a780e892103eefae0b547bbcaca635f4a960470b8038537710a67bf16492f27" exitCode=0 Apr 23 08:15:29.620625 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:29.620236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr" 
event={"ID":"32fba226-f299-4daf-9b75-93ade820fb8b","Type":"ContainerDied","Data":"2a780e892103eefae0b547bbcaca635f4a960470b8038537710a67bf16492f27"} Apr 23 08:15:29.620625 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:29.620562 2576 scope.go:117] "RemoveContainer" containerID="2a780e892103eefae0b547bbcaca635f4a960470b8038537710a67bf16492f27" Apr 23 08:15:30.624787 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:30.624756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-v67lr" event={"ID":"32fba226-f299-4daf-9b75-93ade820fb8b","Type":"ContainerStarted","Data":"cdb111c89b65b0ed32bae607a39f123a5a9e5d885be87721e1f0de8bc91fa42d"} Apr 23 08:15:37.646056 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:37.646020 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4d55644-7fda-4d49-b10d-7977a14862de" containerID="6cc783472ad4ea11967ca246190a4c26d43dedaa3bbada4fd8974a58b0a9d4f0" exitCode=0 Apr 23 08:15:37.646421 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:37.646095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" event={"ID":"e4d55644-7fda-4d49-b10d-7977a14862de","Type":"ContainerDied","Data":"6cc783472ad4ea11967ca246190a4c26d43dedaa3bbada4fd8974a58b0a9d4f0"} Apr 23 08:15:37.646508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:37.646444 2576 scope.go:117] "RemoveContainer" containerID="6cc783472ad4ea11967ca246190a4c26d43dedaa3bbada4fd8974a58b0a9d4f0" Apr 23 08:15:38.650462 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:38.650428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g944w" event={"ID":"e4d55644-7fda-4d49-b10d-7977a14862de","Type":"ContainerStarted","Data":"93462463c6cd3a501749b271f3acddb79f57d2266f7c202c8aa71666bb1eb007"} Apr 23 08:15:48.291821 
ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:48.291784 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:15:48.311978 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:48.311955 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:15:48.693377 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:15:48.693351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:02.052009 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.051969 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:16:02.052603 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.052540 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="alertmanager" containerID="cri-o://bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae" gracePeriod=120 Apr 23 08:16:02.052603 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.052570 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-metric" containerID="cri-o://bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4" gracePeriod=120 Apr 23 08:16:02.052812 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.052614 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="prom-label-proxy" containerID="cri-o://1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060" gracePeriod=120 Apr 23 08:16:02.052812 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:16:02.052625 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-web" containerID="cri-o://15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5" gracePeriod=120 Apr 23 08:16:02.052812 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.052664 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy" containerID="cri-o://b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e" gracePeriod=120 Apr 23 08:16:02.052812 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.052661 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="config-reloader" containerID="cri-o://017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4" gracePeriod=120 Apr 23 08:16:02.720240 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720204 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060" exitCode=0 Apr 23 08:16:02.720240 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720230 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e" exitCode=0 Apr 23 08:16:02.720240 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720237 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4" exitCode=0 Apr 23 08:16:02.720240 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720243 
2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae" exitCode=0
Apr 23 08:16:02.720510 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"}
Apr 23 08:16:02.720510 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"}
Apr 23 08:16:02.720510 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"}
Apr 23 08:16:02.720510 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:02.720304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"}
Apr 23 08:16:03.385718 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.385696 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:16:03.473893 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.473893 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.473893 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473871 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473897 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vwc\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473962 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.473978 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474005 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474044 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474110 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474141 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474189 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474590 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474206 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume\") pod \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\" (UID: \"d7aa592b-ea0f-484f-ac85-c57aae7ccce8\") "
Apr 23 08:16:03.474590 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474286 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:03.474590 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.474505 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.476605 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.476575 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out" (OuterVolumeSpecName: "config-out") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:16:03.476852 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.476817 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc" (OuterVolumeSpecName: "kube-api-access-49vwc") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "kube-api-access-49vwc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:16:03.476852 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.476842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.477273 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.477178 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:16:03.477273 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.477183 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:16:03.478136 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.478110 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.478136 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.478124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:16:03.478435 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.478404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.478827 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.478807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.479105 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.479084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.481282 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.481259 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.487642 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.487620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config" (OuterVolumeSpecName: "web-config") pod "d7aa592b-ea0f-484f-ac85-c57aae7ccce8" (UID: "d7aa592b-ea0f-484f-ac85-c57aae7ccce8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:16:03.575321 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575292 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-web-config\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575321 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575316 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575321 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575326 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-out\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575335 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575345 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-metrics-client-ca\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575357 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-main-tls\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575366 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-alertmanager-main-db\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575374 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575384 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-config-volume\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575392 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-tls-assets\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575400 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-cluster-tls-config\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.575508 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.575408 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49vwc\" (UniqueName: \"kubernetes.io/projected/d7aa592b-ea0f-484f-ac85-c57aae7ccce8-kube-api-access-49vwc\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:03.725933 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725834 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4" exitCode=0
Apr 23 08:16:03.725933 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725859 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerID="15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5" exitCode=0
Apr 23 08:16:03.725933 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"}
Apr 23 08:16:03.725933 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"}
Apr 23 08:16:03.725933 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7aa592b-ea0f-484f-ac85-c57aae7ccce8","Type":"ContainerDied","Data":"05ffc45f4db9ec7eb3d9c9f18d460d438fbe4494ae1dd99faf71f1df5d07b55b"}
Apr 23 08:16:03.726289 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725944 2576 scope.go:117] "RemoveContainer" containerID="1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"
Apr 23 08:16:03.726289 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.725976 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:16:03.733525 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.733510 2576 scope.go:117] "RemoveContainer" containerID="bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"
Apr 23 08:16:03.740341 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.740327 2576 scope.go:117] "RemoveContainer" containerID="b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"
Apr 23 08:16:03.746549 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.746532 2576 scope.go:117] "RemoveContainer" containerID="15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"
Apr 23 08:16:03.749076 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.749055 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:16:03.753133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.753111 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:16:03.753824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.753805 2576 scope.go:117] "RemoveContainer" containerID="017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"
Apr 23 08:16:03.760507 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.760490 2576 scope.go:117] "RemoveContainer" containerID="bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"
Apr 23 08:16:03.766936 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.766921 2576 scope.go:117] "RemoveContainer" containerID="2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"
Apr 23 08:16:03.773230 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.773213 2576 scope.go:117] "RemoveContainer" containerID="1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"
Apr 23 08:16:03.773492 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.773473 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060\": container with ID starting with 1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060 not found: ID does not exist" containerID="1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"
Apr 23 08:16:03.773546 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.773501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"} err="failed to get container status \"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060\": rpc error: code = NotFound desc = could not find container \"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060\": container with ID starting with 1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060 not found: ID does not exist"
Apr 23 08:16:03.773546 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.773533 2576 scope.go:117] "RemoveContainer" containerID="bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"
Apr 23 08:16:03.773773 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.773754 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4\": container with ID starting with bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4 not found: ID does not exist" containerID="bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"
Apr 23 08:16:03.773824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.773779 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"} err="failed to get container status \"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4\": rpc error: code = NotFound desc = could not find container \"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4\": container with ID starting with bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4 not found: ID does not exist"
Apr 23 08:16:03.773824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.773799 2576 scope.go:117] "RemoveContainer" containerID="b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"
Apr 23 08:16:03.774047 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.774033 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e\": container with ID starting with b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e not found: ID does not exist" containerID="b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"
Apr 23 08:16:03.774089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774052 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"} err="failed to get container status \"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e\": rpc error: code = NotFound desc = could not find container \"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e\": container with ID starting with b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e not found: ID does not exist"
Apr 23 08:16:03.774089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774067 2576 scope.go:117] "RemoveContainer" containerID="15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"
Apr 23 08:16:03.774279 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.774262 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5\": container with ID starting with 15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5 not found: ID does not exist" containerID="15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"
Apr 23 08:16:03.774323 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774283 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"} err="failed to get container status \"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5\": rpc error: code = NotFound desc = could not find container \"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5\": container with ID starting with 15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5 not found: ID does not exist"
Apr 23 08:16:03.774323 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774297 2576 scope.go:117] "RemoveContainer" containerID="017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"
Apr 23 08:16:03.774493 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.774477 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4\": container with ID starting with 017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4 not found: ID does not exist" containerID="017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"
Apr 23 08:16:03.774533 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774497 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"} err="failed to get container status \"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4\": rpc error: code = NotFound desc = could not find container \"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4\": container with ID starting with 017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4 not found: ID does not exist"
Apr 23 08:16:03.774533 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774511 2576 scope.go:117] "RemoveContainer" containerID="bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"
Apr 23 08:16:03.774726 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.774709 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae\": container with ID starting with bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae not found: ID does not exist" containerID="bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"
Apr 23 08:16:03.774774 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774731 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"} err="failed to get container status \"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae\": rpc error: code = NotFound desc = could not find container \"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae\": container with ID starting with bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae not found: ID does not exist"
Apr 23 08:16:03.774774 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774744 2576 scope.go:117] "RemoveContainer" containerID="2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"
Apr 23 08:16:03.774958 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:03.774940 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c\": container with ID starting with 2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c not found: ID does not exist" containerID="2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"
Apr 23 08:16:03.775015 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774963 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"} err="failed to get container status \"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c\": rpc error: code = NotFound desc = could not find container \"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c\": container with ID starting with 2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c not found: ID does not exist"
Apr 23 08:16:03.775015 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.774981 2576 scope.go:117] "RemoveContainer" containerID="1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"
Apr 23 08:16:03.775196 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775174 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060"} err="failed to get container status \"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060\": rpc error: code = NotFound desc = could not find container \"1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060\": container with ID starting with 1ed6daa5d108407b3768f7958c85c4f8a43ff2c3b79c8d15d158bbe3d2ac4060 not found: ID does not exist"
Apr 23 08:16:03.775239 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775198 2576 scope.go:117] "RemoveContainer" containerID="bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"
Apr 23 08:16:03.775416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775400 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4"} err="failed to get container status \"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4\": rpc error: code = NotFound desc = could not find container \"bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4\": container with ID starting with bfdbec3fc5669133e865d5409239007926de57c5284bed4130744e615d5186b4 not found: ID does not exist"
Apr 23 08:16:03.775456 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775416 2576 scope.go:117] "RemoveContainer" containerID="b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"
Apr 23 08:16:03.775614 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775597 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e"} err="failed to get container status \"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e\": rpc error: code = NotFound desc = could not find container \"b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e\": container with ID starting with b9094f663758c4847ab6d25a6d7207d6550f4ef78e22bd8e6308a02bd141889e not found: ID does not exist"
Apr 23 08:16:03.775665 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775613 2576 scope.go:117] "RemoveContainer" containerID="15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"
Apr 23 08:16:03.775811 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775795 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5"} err="failed to get container status \"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5\": rpc error: code = NotFound desc = could not find container \"15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5\": container with ID starting with 15221b5f4f82923557c0500be873c3093c3d5a85d8bb8c988eba71fc6ae343a5 not found: ID does not exist"
Apr 23 08:16:03.775853 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.775813 2576 scope.go:117] "RemoveContainer" containerID="017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"
Apr 23 08:16:03.776024 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.776008 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4"} err="failed to get container status \"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4\": rpc error: code = NotFound desc = could not find container \"017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4\": container with ID starting with 017aadff8a23a34fa96dc017747db5369570ff51f49cd9eb8e86d3d662c096e4 not found: ID does not exist"
Apr 23 08:16:03.776070 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.776024 2576 scope.go:117] "RemoveContainer" containerID="bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"
Apr 23 08:16:03.776242 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.776224 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae"} err="failed to get container status \"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae\": rpc error: code = NotFound desc = could not find container \"bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae\": container with ID starting with bb8a5f10e1159ec5d6366ab3865a72ab43fb51653694485f9a2604955ed0a5ae not found: ID does not exist"
Apr 23 08:16:03.776314 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.776246 2576 scope.go:117] "RemoveContainer" containerID="2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"
Apr 23 08:16:03.776445 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.776430 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c"} err="failed to get container status \"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c\": rpc error: code = NotFound desc = could not find container \"2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c\": container with ID starting with 2ae3564cacd645e4c1505b11da76015e7bd97ad5419af62231dda532629fb41c not found: ID does not exist"
Apr 23 08:16:03.779754 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.779735 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:16:03.780036 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780023 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-web"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780038 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-web"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="prom-label-proxy"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780053 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="prom-label-proxy"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780065 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-metric"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780070 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-metric"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780080 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780085 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy"
Apr 23 08:16:03.780093 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780095 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="config-reloader"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780101 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="config-reloader"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780107 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="init-config-reloader"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780112 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="init-config-reloader"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780119 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="alertmanager"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780124 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="alertmanager"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780168 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="prom-label-proxy"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780177 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="alertmanager"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780182 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-metric"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780190 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="config-reloader"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780196 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy"
Apr 23 08:16:03.780339 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.780204 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" containerName="kube-rbac-proxy-web"
Apr 23 08:16:03.783368 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.783355 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.785541 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785526 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:16:03.785625 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785528 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:16:03.785682 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785646 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:16:03.785682 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:16:03.785773 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:16:03.785983 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fb9ht\"" Apr 23 08:16:03.786047 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.785974 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:16:03.786264 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.786249 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:16:03.786334 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.786275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:16:03.796534 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.796513 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:16:03.797824 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.797802 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:16:03.879123 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-out\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879267 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879267 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879267 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879267 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879409 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzw2\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-kube-api-access-cbzw2\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879409 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879409 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-web-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879409 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:16:03.879366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879523 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879523 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879523 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.879523 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.879501 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.980895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.980895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.980895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.980895 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 
ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-out\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.980988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzw2\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-kube-api-access-cbzw2\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981184 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-web-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981574 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.981574 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.981445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-main-db\") pod \"alertmanager-main-0\" 
(UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.982312 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.982283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.982426 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.982382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa969d9a-15a9-4698-bdc0-8d556080c4c7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984294 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.983995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-out\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984294 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984294 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984294 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984294 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984554 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.984867 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.984851 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.985826 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.985812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa969d9a-15a9-4698-bdc0-8d556080c4c7-web-config\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:03.988925 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:03.988890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzw2\" (UniqueName: \"kubernetes.io/projected/aa969d9a-15a9-4698-bdc0-8d556080c4c7-kube-api-access-cbzw2\") pod \"alertmanager-main-0\" (UID: \"aa969d9a-15a9-4698-bdc0-8d556080c4c7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:04.096902 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:04.096869 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:16:04.236127 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:04.236100 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:16:04.237716 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:16:04.237691 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa969d9a_15a9_4698_bdc0_8d556080c4c7.slice/crio-35b995a2e28afdbaa3c06e0a267c365ed0f3c09f3bb2bcf7570df4f0537785b5 WatchSource:0}: Error finding container 35b995a2e28afdbaa3c06e0a267c365ed0f3c09f3bb2bcf7570df4f0537785b5: Status 404 returned error can't find the container with id 35b995a2e28afdbaa3c06e0a267c365ed0f3c09f3bb2bcf7570df4f0537785b5 Apr 23 08:16:04.730868 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:04.730831 2576 generic.go:358] "Generic (PLEG): container finished" podID="aa969d9a-15a9-4698-bdc0-8d556080c4c7" containerID="dc4ee5442a635f2bc936e0d29336db05668a24d669769e358d07cd3b57b35896" exitCode=0 Apr 23 08:16:04.731332 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:04.730937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerDied","Data":"dc4ee5442a635f2bc936e0d29336db05668a24d669769e358d07cd3b57b35896"} Apr 23 08:16:04.731332 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:04.730977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"35b995a2e28afdbaa3c06e0a267c365ed0f3c09f3bb2bcf7570df4f0537785b5"} Apr 23 08:16:05.006481 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.006456 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7aa592b-ea0f-484f-ac85-c57aae7ccce8" 
path="/var/lib/kubelet/pods/d7aa592b-ea0f-484f-ac85-c57aae7ccce8/volumes" Apr 23 08:16:05.736461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"8a420733ef7bb3efc7339bddc8decce4f424d595a641f8decafd47741839b3c4"} Apr 23 08:16:05.736461 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"eb813e11c7e3044560464c63c091c303491bdb9adf813b5e67cd7f310bcb246f"} Apr 23 08:16:05.736882 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"7e30449a550fd750e4c45270191de171316e9fb35b105d9f11f7b01fa4bf01b6"} Apr 23 08:16:05.736882 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"bcbeec3347fec0c8270f3c9c2d9d4585c65b8838edcbb82a8dcbc47f858c9a5b"} Apr 23 08:16:05.736882 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"157e9d0a51d6d648deda823a8e31faf20c9a2e0676dba7bc469da13adbb3963c"} Apr 23 08:16:05.736882 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.736505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"aa969d9a-15a9-4698-bdc0-8d556080c4c7","Type":"ContainerStarted","Data":"9ab331bd6949df6524efda9bf3e980b423e3629a96d77c7b92651b98299885df"} Apr 23 08:16:05.764552 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:05.764489 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.764469152 podStartE2EDuration="2.764469152s" podCreationTimestamp="2026-04-23 08:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:16:05.762467719 +0000 UTC m=+213.407252366" watchObservedRunningTime="2026-04-23 08:16:05.764469152 +0000 UTC m=+213.409253797" Apr 23 08:16:06.337683 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.337648 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:16:06.338296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338244 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="prometheus" containerID="cri-o://ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f" gracePeriod=600 Apr 23 08:16:06.338419 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338292 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-web" containerID="cri-o://293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" gracePeriod=600 Apr 23 08:16:06.338419 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338300 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" gracePeriod=600 Apr 23 08:16:06.338419 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338268 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="thanos-sidecar" containerID="cri-o://b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b" gracePeriod=600 Apr 23 08:16:06.338419 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338267 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="config-reloader" containerID="cri-o://31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b" gracePeriod=600 Apr 23 08:16:06.338624 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.338253 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy" containerID="cri-o://b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" gracePeriod=600 Apr 23 08:16:06.580653 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.580629 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.703061 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703247 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703085 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdp4m\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703247 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703109 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703247 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703247 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703247 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703302 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703359 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703404 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") 
" Apr 23 08:16:06.703513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703487 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703561 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703598 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703626 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: 
I0423 08:16:06.703649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703674 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:16:06.703748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703701 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle\") pod \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\" (UID: \"9c5fe44d-cda0-438a-a2f3-b01116ca9337\") " Apr 23 08:16:06.704160 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.703989 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\"" Apr 23 08:16:06.704381 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.704357 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:16:06.704646 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.704617 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:16:06.705439 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.705358 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:16:06.705571 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.705532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m" (OuterVolumeSpecName: "kube-api-access-kdp4m") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "kube-api-access-kdp4m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:16:06.706111 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.706081 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:16:06.706207 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.706111 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.706207 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.706131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:16:06.707007 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.706981 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:16:06.707513 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.707487 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.707644 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.707620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.707748 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.707682 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config" (OuterVolumeSpecName: "config") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.708000 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.707969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.708087 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.708003 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.708087 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.708031 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.708087 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.708048 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.708239 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.708139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out" (OuterVolumeSpecName: "config-out") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:16:06.717432 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.717411 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config" (OuterVolumeSpecName: "web-config") pod "9c5fe44d-cda0-438a-a2f3-b01116ca9337" (UID: "9c5fe44d-cda0-438a-a2f3-b01116ca9337"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:16:06.742077 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742052 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" exitCode=0 Apr 23 08:16:06.742077 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742072 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" exitCode=0 Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742080 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" exitCode=0 Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742087 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b" exitCode=0 Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742093 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b" exitCode=0 Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742101 2576 generic.go:358] "Generic (PLEG): container finished" podID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f" exitCode=0 Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742163 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742213 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c5fe44d-cda0-438a-a2f3-b01116ca9337","Type":"ContainerDied","Data":"e33e5cb4872cad470e9cac5067c45ccd75b7a4e7b3364ed50bfc5d4aeaa269fe"} Apr 23 08:16:06.742416 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.742252 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" Apr 23 08:16:06.751274 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.751257 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" Apr 23 08:16:06.757496 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.757478 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" Apr 23 08:16:06.763507 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.763489 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b" Apr 23 08:16:06.766721 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.766694 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:16:06.770427 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.770409 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b" Apr 23 08:16:06.770690 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.770670 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:16:06.776408 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.776392 2576 scope.go:117] "RemoveContainer" 
containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f" Apr 23 08:16:06.782970 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.782954 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a" Apr 23 08:16:06.788852 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.788837 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" Apr 23 08:16:06.789108 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.789089 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" Apr 23 08:16:06.789190 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789114 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist" Apr 23 08:16:06.789190 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789133 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" Apr 23 08:16:06.789359 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.789343 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" Apr 23 08:16:06.789401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789367 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist" Apr 23 08:16:06.789401 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789386 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" Apr 23 08:16:06.789590 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.789574 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" Apr 23 08:16:06.789630 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789595 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container 
\"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist" Apr 23 08:16:06.789630 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789610 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b" Apr 23 08:16:06.789811 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.789796 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b" Apr 23 08:16:06.789846 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789815 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist" Apr 23 08:16:06.789846 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.789828 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b" Apr 23 08:16:06.790058 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.790042 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 
31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b" Apr 23 08:16:06.790097 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790062 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist" Apr 23 08:16:06.790097 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790075 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f" Apr 23 08:16:06.790262 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.790249 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f" Apr 23 08:16:06.790297 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790266 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with 
ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist" Apr 23 08:16:06.790297 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790278 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a" Apr 23 08:16:06.790508 ip-10-0-139-180 kubenswrapper[2576]: E0423 08:16:06.790493 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a" Apr 23 08:16:06.790551 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790511 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist" Apr 23 08:16:06.790551 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790523 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377" Apr 23 08:16:06.790732 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790716 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find 
container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist" Apr 23 08:16:06.790781 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790732 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda" Apr 23 08:16:06.790966 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790946 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist" Apr 23 08:16:06.791034 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.790966 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f" Apr 23 08:16:06.791167 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791150 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist" Apr 23 08:16:06.791213 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791167 2576 scope.go:117] "RemoveContainer" 
containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"
Apr 23 08:16:06.791342 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791328 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist"
Apr 23 08:16:06.791382 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791342 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"
Apr 23 08:16:06.791522 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791509 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist"
Apr 23 08:16:06.791563 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791523 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"
Apr 23 08:16:06.791721 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791704 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist"
Apr 23 08:16:06.791760 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791722 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"
Apr 23 08:16:06.791927 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791898 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist"
Apr 23 08:16:06.791977 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.791928 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"
Apr 23 08:16:06.792117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist"
Apr 23 08:16:06.792156 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792117 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"
Apr 23 08:16:06.792344 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792295 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist"
Apr 23 08:16:06.792344 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792311 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"
Apr 23 08:16:06.792555 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792523 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist"
Apr 23 08:16:06.792555 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792543 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"
Apr 23 08:16:06.792853 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792781 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist"
Apr 23 08:16:06.792853 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.792803 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"
Apr 23 08:16:06.793066 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793049 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist"
Apr 23 08:16:06.793113 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793067 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"
Apr 23 08:16:06.793345 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793317 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist"
Apr 23 08:16:06.793345 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793335 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"
Apr 23 08:16:06.793599 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793578 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist"
Apr 23 08:16:06.793721 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793600 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"
Apr 23 08:16:06.793888 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793871 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist"
Apr 23 08:16:06.793974 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.793890 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"
Apr 23 08:16:06.794129 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794111 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist"
Apr 23 08:16:06.794129 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794130 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"
Apr 23 08:16:06.794372 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794351 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist"
Apr 23 08:16:06.794442 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794373 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"
Apr 23 08:16:06.794514 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794497 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:16:06.794616 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794575 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist"
Apr 23 08:16:06.794616 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794595 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"
Apr 23 08:16:06.794842 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794821 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794843 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-web"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794863 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-web"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794877 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-thanos"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794884 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-thanos"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794892 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="prometheus"
Apr 23 08:16:06.794889 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794898 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="prometheus"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794904 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="config-reloader"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794844 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794927 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="config-reloader"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794944 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="init-config-reloader"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794951 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="init-config-reloader"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794966 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="thanos-sidecar"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794971 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="thanos-sidecar"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794976 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.794981 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795060 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="prometheus"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795072 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-web"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795095 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="config-reloader"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795105 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="kube-rbac-proxy-thanos"
Apr 23 08:16:06.795117 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795114 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" containerName="thanos-sidecar"
Apr 23 08:16:06.795600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795147 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist"
Apr 23 08:16:06.795600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795167 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"
Apr 23 08:16:06.795600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795382 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist"
Apr 23 08:16:06.795600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795400 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"
Apr 23 08:16:06.795720 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795601 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist"
Apr 23 08:16:06.795720 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795620 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"
Apr 23 08:16:06.795789 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795764 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist"
Apr 23 08:16:06.795789 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795781 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"
Apr 23 08:16:06.795951 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795937 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist"
Apr 23 08:16:06.795998 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.795951 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"
Apr 23 08:16:06.796133 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796114 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist"
Apr 23 08:16:06.796204 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796136 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"
Apr 23 08:16:06.796333 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796317 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist"
Apr 23 08:16:06.796431 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796334 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"
Apr 23 08:16:06.796534 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796518 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist"
Apr 23 08:16:06.796534 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796533 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"
Apr 23 08:16:06.796738 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796719 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist"
Apr 23 08:16:06.796784 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796738 2576 scope.go:117] "RemoveContainer" containerID="601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"
Apr 23 08:16:06.796987 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796966 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377"} err="failed to get container status \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": rpc error: code = NotFound desc = could not find container \"601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377\": container with ID starting with 601d1aaa4108ffeb211c05bb03838a75cf88f3ea6c446a9158ba0d0afe07f377 not found: ID does not exist"
Apr 23 08:16:06.797075 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.796989 2576 scope.go:117] "RemoveContainer" containerID="b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"
Apr 23 08:16:06.797202 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797185 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda"} err="failed to get container status \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": rpc error: code = NotFound desc = could not find container \"b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda\": container with ID starting with b22763a55318a1fe8a9d2709d7c8d30b68c9ab2a612928c05e51e41d50d50fda not found: ID does not exist"
Apr 23 08:16:06.797242 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797202 2576 scope.go:117] "RemoveContainer" containerID="293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"
Apr 23 08:16:06.797422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797407 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f"} err="failed to get container status \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": rpc error: code = NotFound desc = could not find container \"293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f\": container with ID starting with 293844007c4cd2f83fb09baeef5e33077ba7d92d6aa21320de090522b50f478f not found: ID does not exist"
Apr 23 08:16:06.797422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797422 2576 scope.go:117] "RemoveContainer" containerID="b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"
Apr 23 08:16:06.797634 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797615 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b"} err="failed to get container status \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": rpc error: code = NotFound desc = could not find container \"b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b\": container with ID starting with b753bab5600015b1d8bacb9710895b45aaba6f38c7db74778ba6887d48e65c1b not found: ID does not exist"
Apr 23 08:16:06.797677 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797634 2576 scope.go:117] "RemoveContainer" containerID="31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"
Apr 23 08:16:06.797803 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797788 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b"} err="failed to get container status \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": rpc error: code = NotFound desc = could not find container \"31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b\": container with ID starting with 31f130439b3e50fdba82f6b627366768ace418f750919a3b7062fa4ebbd9b24b not found: ID does not exist"
Apr 23 08:16:06.797839 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.797803 2576 scope.go:117] "RemoveContainer" containerID="ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"
Apr 23 08:16:06.798018 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.798000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f"} err="failed to get container status \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": rpc error: code = NotFound desc = could not find container \"ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f\": container with ID starting with ced20db922a3fb73796802a13d5aca80c737baf03a46654c0fcb3aea7444d82f not found: ID does not exist"
Apr 23 08:16:06.798064 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.798020 2576 scope.go:117] "RemoveContainer" containerID="0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"
Apr 23 08:16:06.798218 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.798201 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a"} err="failed to get container status \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": rpc error: code = NotFound desc = could not find container \"0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a\": container with ID starting with 0170dab5ad73b77bd1965bf7c82b9e2f972847895dbd00659ea57247caf4773a not found: ID does not exist"
Apr 23 08:16:06.799244 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.799231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:06.801427 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 08:16:06.801533 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9ptno4f5oafb0\""
Apr 23 08:16:06.801593 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 08:16:06.801794 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 08:16:06.801901 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 08:16:06.801901 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 08:16:06.802035 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.801975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 08:16:06.802089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.802073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 08:16:06.802127 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.802113 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 08:16:06.802296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.802275 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 08:16:06.802296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.802294 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-q94ts\""
Apr 23 08:16:06.802443 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.802388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 08:16:06.804610 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.804591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 08:16:06.805321 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805302 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805338 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805355 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-kube-rbac-proxy\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805371 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config-out\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805385 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-metrics-client-ca\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805422 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805406 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-metrics-client-certs\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805422 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805439 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805460 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdp4m\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-kube-api-access-kdp4m\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805474 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c5fe44d-cda0-438a-a2f3-b01116ca9337-tls-assets\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805489 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805503 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-config\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805516 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805537 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805551 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c5fe44d-cda0-438a-a2f3-b01116ca9337-prometheus-k8s-db\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805564 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-web-config\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.805672 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.805579 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c5fe44d-cda0-438a-a2f3-b01116ca9337-secret-grpc-tls\") on node \"ip-10-0-139-180.ec2.internal\" DevicePath \"\""
Apr 23 08:16:06.807459 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.807441 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 08:16:06.810200 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.810178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:16:06.906209 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:06.906305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:06.906305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:06.906305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:06.906305 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906486 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906544 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddqg\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-kube-api-access-zddqg\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 
kubenswrapper[2576]: I0423 08:16:06.906605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906648 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:06.906878 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:06.906676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:16:07.005810 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.005739 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5fe44d-cda0-438a-a2f3-b01116ca9337" path="/var/lib/kubelet/pods/9c5fe44d-cda0-438a-a2f3-b01116ca9337/volumes" 
Apr 23 08:16:07.007251 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007296 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007360 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007360 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007504 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.007589 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zddqg\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-kube-api-access-zddqg\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.008082 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.007924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.009749 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.008122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.009749 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.008545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.009749 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.009086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.009749 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.009547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.010031 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.009797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.010600 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.010579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.010708 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.010594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.010788 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.010758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.011165 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.011124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.011323 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.011283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.011707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.011673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4a0e9cc-1d1e-485d-ad8d-b54647260373-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.012244 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.012219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.012386 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.012361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.012973 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.012840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.012973 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.012850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.012973 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.012968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4a0e9cc-1d1e-485d-ad8d-b54647260373-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.013707 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.013690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4a0e9cc-1d1e-485d-ad8d-b54647260373-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.017900 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.017883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddqg\" (UniqueName: \"kubernetes.io/projected/e4a0e9cc-1d1e-485d-ad8d-b54647260373-kube-api-access-zddqg\") pod \"prometheus-k8s-0\" (UID: \"e4a0e9cc-1d1e-485d-ad8d-b54647260373\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.109263 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.109238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:16:07.249775 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.249619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:16:07.252290 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:16:07.252264 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a0e9cc_1d1e_485d_ad8d_b54647260373.slice/crio-371ee2167cddb22e6cffdcf6362cde344ae7ed5a36847546e730b38b5e45aa5c WatchSource:0}: Error finding container 371ee2167cddb22e6cffdcf6362cde344ae7ed5a36847546e730b38b5e45aa5c: Status 404 returned error can't find the container with id 371ee2167cddb22e6cffdcf6362cde344ae7ed5a36847546e730b38b5e45aa5c
Apr 23 08:16:07.746818 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.746789 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4a0e9cc-1d1e-485d-ad8d-b54647260373" containerID="3e9e3087817fcf50dcca373f51da7fe28574adfc86d44a8f18c506403ac2fa6b" exitCode=0
Apr 23 08:16:07.747316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.746844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerDied","Data":"3e9e3087817fcf50dcca373f51da7fe28574adfc86d44a8f18c506403ac2fa6b"}
Apr 23 08:16:07.747316 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:07.746867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"371ee2167cddb22e6cffdcf6362cde344ae7ed5a36847546e730b38b5e45aa5c"}
Apr 23 08:16:08.752045 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"36a00890d3ccd439a1b65a2716f3e6897b60f1486a952d0eae562fa989f30d7a"}
Apr 23 08:16:08.752045 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"909404e3e454fb4904721695d9236694fe7db8d4fc5b3d98e5f7f4e81484b1fe"}
Apr 23 08:16:08.752444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"06a7d253b9a3d1c9ed05b7beeb0e0882d06088acd1d7846c011a7bbb2fcd5279"}
Apr 23 08:16:08.752444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"bac418ad1d93dd83978f4ced6b1ebd9686fe329bf05e55a2fffdee18dd3d2cd0"}
Apr 23 08:16:08.752444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"5f182ccdf18d1327b5a951c29e1bb5e88bd16da5a1b40538c072faf853f4172f"}
Apr 23 08:16:08.752444 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.752093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4a0e9cc-1d1e-485d-ad8d-b54647260373","Type":"ContainerStarted","Data":"3971913c841b33e0a89cab33b50b54b80277a3e6e7e52737b9bf54f9eacf103a"}
Apr 23 08:16:08.780132 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:08.780084 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.780070469 podStartE2EDuration="2.780070469s" podCreationTimestamp="2026-04-23 08:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:16:08.778577054 +0000 UTC m=+216.423361727" watchObservedRunningTime="2026-04-23 08:16:08.780070469 +0000 UTC m=+216.424855112"
Apr 23 08:16:12.110030 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:16:12.109988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:07.110153 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:07.110076 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:07.125285 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:07.125260 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:07.939201 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:07.939173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:17:32.885993 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:32.885964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23 08:17:32.886816 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:32.886793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23 08:17:32.895862 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:17:32.895842 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:19:23.112936 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:23.112879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vdn4h_d194f1f2-2b7e-468a-86d7-142892eaac07/global-pull-secret-syncer/0.log"
Apr 23 08:19:23.238664 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:23.238633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c7k5b_9fb79acc-affd-4077-a7a0-e13654094fad/konnectivity-agent/0.log"
Apr 23 08:19:23.288769 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:23.288745 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-180.ec2.internal_5c11cee49b20d4343744d29b36ea3100/haproxy/0.log"
Apr 23 08:19:26.321385 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.321360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/alertmanager/0.log"
Apr 23 08:19:26.346540 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.346519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/config-reloader/0.log"
Apr 23 08:19:26.370317 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.370295 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/kube-rbac-proxy-web/0.log"
Apr 23 08:19:26.390695 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.390673 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/kube-rbac-proxy/0.log"
Apr 23 08:19:26.412132 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.412107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/kube-rbac-proxy-metric/0.log"
Apr 23 08:19:26.434171 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.434140 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/prom-label-proxy/0.log"
Apr 23 08:19:26.455576 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.455536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa969d9a-15a9-4698-bdc0-8d556080c4c7/init-config-reloader/0.log"
Apr 23 08:19:26.587258 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.587186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f9457c4cc-84fth_70e97294-08d4-4a90-a5a4-1df7d8169357/metrics-server/0.log"
Apr 23 08:19:26.636429 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.636401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2fh9j_04db17f1-1934-4280-804c-b2639a712354/node-exporter/0.log"
Apr 23 08:19:26.655219 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.655194 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2fh9j_04db17f1-1934-4280-804c-b2639a712354/kube-rbac-proxy/0.log"
Apr 23 08:19:26.680188 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.680165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2fh9j_04db17f1-1934-4280-804c-b2639a712354/init-textfile/0.log"
Apr 23 08:19:26.925003 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.924978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/prometheus/0.log"
Apr 23 08:19:26.944211 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.944183 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/config-reloader/0.log"
Apr 23 08:19:26.965574 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.965554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/thanos-sidecar/0.log"
Apr 23 08:19:26.987755 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:26.987738 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/kube-rbac-proxy-web/0.log"
Apr 23 08:19:27.009080 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:27.009051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/kube-rbac-proxy/0.log"
Apr 23 08:19:27.033665 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:27.033644 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/kube-rbac-proxy-thanos/0.log"
Apr 23 08:19:27.053168 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:27.053150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e4a0e9cc-1d1e-485d-ad8d-b54647260373/init-config-reloader/0.log"
Apr 23 08:19:27.128153 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:27.128128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bkk8b_4bd5b857-ad33-4875-93dc-f093e035eac7/prometheus-operator-admission-webhook/0.log"
Apr 23 08:19:28.378457 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:28.378386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-sdmb6_0bf9e2c3-8864-4eb2-8cf6-02d901f1f6a2/networking-console-plugin/0.log"
Apr 23 08:19:28.763089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:28.762970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/2.log"
Apr 23
08:19:28.766784 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:28.766761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dwbj6_5c078d65-5482-4cf4-96a9-20d4ce24cf24/console-operator/3.log" Apr 23 08:19:29.454664 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.454636 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj"] Apr 23 08:19:29.458127 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.458106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.460904 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.460885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"kube-root-ca.crt\"" Apr 23 08:19:29.461016 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.460904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"openshift-service-ca.crt\"" Apr 23 08:19:29.461661 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.461640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x67hq\"/\"default-dockercfg-kgww6\"" Apr 23 08:19:29.467579 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.467560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj"] Apr 23 08:19:29.477957 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.477936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-4528n_b0e306ba-d423-4c00-810b-cb7950b66fb6/volume-data-source-validator/0.log" Apr 23 08:19:29.570249 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.570221 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-proc\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.570400 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.570276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-podres\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.570400 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.570357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-sys\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.570400 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.570381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxm2\" (UniqueName: \"kubernetes.io/projected/74440f53-70ed-4233-bc50-09eaf420b4f9-kube-api-access-zqxm2\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.570516 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.570403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-lib-modules\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: 
\"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671396 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-sys\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671396 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxm2\" (UniqueName: \"kubernetes.io/projected/74440f53-70ed-4233-bc50-09eaf420b4f9-kube-api-access-zqxm2\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-lib-modules\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-proc\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-sys\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-podres\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-podres\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-proc\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.671632 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.671626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74440f53-70ed-4233-bc50-09eaf420b4f9-lib-modules\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.679074 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.679048 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zqxm2\" (UniqueName: \"kubernetes.io/projected/74440f53-70ed-4233-bc50-09eaf420b4f9-kube-api-access-zqxm2\") pod \"perf-node-gather-daemonset-lvjwj\" (UID: \"74440f53-70ed-4233-bc50-09eaf420b4f9\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.768207 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.768116 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:29.886724 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.886700 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj"] Apr 23 08:19:29.889463 ip-10-0-139-180 kubenswrapper[2576]: W0423 08:19:29.889436 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod74440f53_70ed_4233_bc50_09eaf420b4f9.slice/crio-8a4adff15375cd2cf9cedbc34d30593caa9c541bad1c68fea813e4b9986d57c4 WatchSource:0}: Error finding container 8a4adff15375cd2cf9cedbc34d30593caa9c541bad1c68fea813e4b9986d57c4: Status 404 returned error can't find the container with id 8a4adff15375cd2cf9cedbc34d30593caa9c541bad1c68fea813e4b9986d57c4 Apr 23 08:19:29.891357 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:29.891338 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:19:30.061784 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.061713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g69k6_9ca5f258-f8f2-45d5-8de6-67a2bd1028b3/dns/0.log" Apr 23 08:19:30.080389 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.080369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g69k6_9ca5f258-f8f2-45d5-8de6-67a2bd1028b3/kube-rbac-proxy/0.log" Apr 23 08:19:30.229012 ip-10-0-139-180 kubenswrapper[2576]: I0423 
08:19:30.228985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkg5x_08aba02d-ea63-43f8-9e3d-409a65aa759d/dns-node-resolver/0.log" Apr 23 08:19:30.317164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.317080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" event={"ID":"74440f53-70ed-4233-bc50-09eaf420b4f9","Type":"ContainerStarted","Data":"c1b9a3300e9af345a27c072aca7a2c92b7e1e56abf84180df2cd25822f8022b7"} Apr 23 08:19:30.317164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.317118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" event={"ID":"74440f53-70ed-4233-bc50-09eaf420b4f9","Type":"ContainerStarted","Data":"8a4adff15375cd2cf9cedbc34d30593caa9c541bad1c68fea813e4b9986d57c4"} Apr 23 08:19:30.317164 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.317153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:30.354393 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.354346 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" podStartSLOduration=1.354332425 podStartE2EDuration="1.354332425s" podCreationTimestamp="2026-04-23 08:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:19:30.353514621 +0000 UTC m=+417.998299265" watchObservedRunningTime="2026-04-23 08:19:30.354332425 +0000 UTC m=+417.999117068" Apr 23 08:19:30.643786 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.643759 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-75b96b49cf-p7vvq_9c63121f-0c3a-4415-aa44-6b0db0179fc8/registry/0.log" Apr 23 
08:19:30.710097 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:30.710071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-stmzm_152c2fa0-6d86-426a-bbda-00226a6a9cc0/node-ca/0.log" Apr 23 08:19:31.657285 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:31.657261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p8pm2_6d10c1ba-7b11-4a83-938c-04443e1047c2/serve-healthcheck-canary/0.log" Apr 23 08:19:32.144025 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:32.144000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwwgv_d9f7bef3-14ef-4112-840d-8b9820e79e4b/kube-rbac-proxy/0.log" Apr 23 08:19:32.167859 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:32.167834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwwgv_d9f7bef3-14ef-4112-840d-8b9820e79e4b/exporter/0.log" Apr 23 08:19:32.188317 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:32.188296 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwwgv_d9f7bef3-14ef-4112-840d-8b9820e79e4b/extractor/0.log" Apr 23 08:19:36.181738 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:36.181707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-g944w_e4d55644-7fda-4d49-b10d-7977a14862de/kube-storage-version-migrator-operator/1.log" Apr 23 08:19:36.182590 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:36.182575 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-g944w_e4d55644-7fda-4d49-b10d-7977a14862de/kube-storage-version-migrator-operator/0.log" Apr 23 08:19:36.329089 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:36.329057 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-lvjwj" Apr 23 08:19:37.035106 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.035077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/kube-multus-additional-cni-plugins/0.log" Apr 23 08:19:37.056018 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.055992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/egress-router-binary-copy/0.log" Apr 23 08:19:37.075795 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.075769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/cni-plugins/0.log" Apr 23 08:19:37.096823 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.096801 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/bond-cni-plugin/0.log" Apr 23 08:19:37.116793 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.116771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/routeoverride-cni/0.log" Apr 23 08:19:37.185405 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.185378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/whereabouts-cni-bincopy/0.log" Apr 23 08:19:37.212010 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.211977 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g7chr_42d0b805-f001-437e-a62c-21317c5168f5/whereabouts-cni/0.log" Apr 23 
08:19:37.410345 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.410314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgplt_18cb3bcf-df53-4ee2-abe1-5fd5156e3bc1/kube-multus/0.log" Apr 23 08:19:37.543845 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.543812 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fflkd_df2ff433-01c3-442f-b962-0dbfe4dd622f/network-metrics-daemon/0.log" Apr 23 08:19:37.570161 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:37.570138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fflkd_df2ff433-01c3-442f-b962-0dbfe4dd622f/kube-rbac-proxy/0.log" Apr 23 08:19:38.734606 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.734584 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/ovn-controller/0.log" Apr 23 08:19:38.758754 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.758687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/ovn-acl-logging/0.log" Apr 23 08:19:38.781920 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.781883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/kube-rbac-proxy-node/0.log" Apr 23 08:19:38.802398 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.802355 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 08:19:38.819317 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.819301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/northd/0.log" Apr 23 
08:19:38.843975 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.843958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/nbdb/0.log" Apr 23 08:19:38.866070 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.866054 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/sbdb/0.log" Apr 23 08:19:38.961115 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:38.961088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgndg_2c90cc59-7d96-4997-8dd0-3c2ac01d264d/ovnkube-controller/0.log" Apr 23 08:19:40.184517 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:40.184491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-fm4xg_39e863f9-7f2e-4939-8af7-7376f6f63bb0/check-endpoints/0.log" Apr 23 08:19:40.234195 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:40.234170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jb8d5_9e38d3e1-2d82-4dcf-a3b0-58b8c0571ff9/network-check-target-container/0.log" Apr 23 08:19:41.087544 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:41.087513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-ncw6m_724188f0-71e3-4a41-97c0-51d0f88e6c75/iptables-alerter/0.log" Apr 23 08:19:41.636747 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:41.636718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5xcj5_54fc82b6-757a-4606-81ce-75c113d9a233/tuned/0.log" Apr 23 08:19:44.079991 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.079963 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-v67lr_32fba226-f299-4daf-9b75-93ade820fb8b/service-ca-operator/1.log" Apr 23 08:19:44.080847 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.080826 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-v67lr_32fba226-f299-4daf-9b75-93ade820fb8b/service-ca-operator/0.log" Apr 23 08:19:44.357541 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.357459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-h9tbn_cde94229-f202-4dfb-829e-a8b6643aa642/service-ca-controller/0.log" Apr 23 08:19:44.691155 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.691130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fqwjw_522a528b-85f9-4e19-a62b-b53d7868c26e/csi-driver/0.log" Apr 23 08:19:44.711383 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.711360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fqwjw_522a528b-85f9-4e19-a62b-b53d7868c26e/csi-node-driver-registrar/0.log" Apr 23 08:19:44.731749 ip-10-0-139-180 kubenswrapper[2576]: I0423 08:19:44.731728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-fqwjw_522a528b-85f9-4e19-a62b-b53d7868c26e/csi-liveness-probe/0.log"